Assuming you know how many elements you want to store on the GPU, you can easily compute how much memory those elements will need.
A simple example:
import numpy as np
import theano.tensor as T
T.config.floatX = 'float32'
dataPoints = np.random.random((5000, 256 * 256)).astype(T.config.floatX)
# 4 bytes per float32 element, plus (some small over-head constant)
sizeInGBs = 5000 * 256 * 256 * 4 / 1024. / 1024 / 1024
print "Data will need %.2f GBs of free memory" % sizeInGBs
Taking the overhead constant as 0, this prints:
>>> Data will need 1.22 GBs of free memory
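The arithmetic above generalizes to any array shape and element size. A minimal sketch of a helper for this (my own hypothetical function, not part of Theano; it ignores the overhead constant):

```python
# Hypothetical helper: estimate an array's size in GB from its shape and
# the size of one element in bytes (4 for float32, 8 for float64).
def array_size_gb(shape, bytes_per_element=4):
    n = 1
    for dim in shape:
        n *= dim
    return n * bytes_per_element / 1024. / 1024 / 1024

# The 5000 x (256 * 256) float32 matrix from the example above:
print("%.2f GBs" % array_size_gb((5000, 256 * 256)))  # -> 1.22 GBs
```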
If you are using an NVIDIA graphics card and have CUDA installed on your machine, you can easily get the total amount of free memory on your GPU with the following code:
import theano.sandbox.cuda.basic_ops as sbcuda
import numpy as np
import theano.tensor as T
from theano import shared
T.config.floatX = 'float32'
GPUFreeMemoryInBytes = sbcuda.cuda_ndarray.cuda_ndarray.mem_info()[0]
freeGPUMemInGBs = GPUFreeMemoryInBytes / 1024. / 1024 / 1024
print "Your GPU has %s GBs of free memory" % str(freeGPUMemInGBs)
# An example task: move the data matrix to the GPU as a shared variable
testData = shared(np.random.random((5000, 256 * 256)).astype(T.config.floatX), borrow=True)
# Query the free memory again to see how much the allocation used
GPUFreeMemoryInBytes = sbcuda.cuda_ndarray.cuda_ndarray.mem_info()[0]
newFreeGPUMemInGBs = GPUFreeMemoryInBytes / 1024. / 1024 / 1024
print "The tasks above used %s GBs of your GPU memory. The available memory is %s GBs" % (str(freeGPUMemInGBs - newFreeGPUMemInGBs), str(newFreeGPUMemInGBs))
The output is then in the following format (on my machine):
>>> Your GPU has 11.2557678223 GBs of free memory
>>> The tasks above used 1.22077941895 GBs of your GPU memory. The available memory is 10.0349884033 GBs
By monitoring the free memory and computing the size of your model/data, you can make better use of the GPU memory. However, be aware of memory fragmentation, since it can unexpectedly cause a MemoryError.
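One way to act on the free-memory figure is to check, before allocating, whether the data fits, and to split it into chunks that do. A minimal sketch, assuming a hypothetical `plan_chunks` helper and a safety margin to account for overhead and fragmentation (neither is part of Theano):

```python
# Hypothetical sketch: given the number of rows to upload, the size of one
# row in bytes, and the reported free GPU memory, decide how many rows can
# go in one chunk, keeping a safety margin for overhead/fragmentation.
def plan_chunks(n_rows, row_bytes, free_bytes, safety=0.9):
    rows_per_chunk = int(free_bytes * safety // row_bytes)
    if rows_per_chunk <= 0:
        raise MemoryError("a single row does not fit in free GPU memory")
    return min(n_rows, rows_per_chunk)

# Using the figures from the example above: ~10 GB free,
# rows of 256 * 256 float32 values (4 bytes each).
free_bytes = int(10.03 * 1024 ** 3)
row_bytes = 256 * 256 * 4
print(plan_chunks(5000, row_bytes, free_bytes))  # -> 5000 (everything fits)
```

With less free memory, `plan_chunks` would return a smaller chunk size, and the data would be uploaded (and processed) chunk by chunk instead of in one shared variable.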