Class LayerMemoryReport.Builder
- java.lang.Object
-
- org.deeplearning4j.nn.conf.memory.LayerMemoryReport.Builder
-
- Enclosing class:
- LayerMemoryReport
public static class LayerMemoryReport.Builder extends Object
Method Summary
- LayerMemoryReport build()
- LayerMemoryReport.Builder cacheMemory(long cacheModeMemoryFixed, long cacheModeMemoryVariablePerEx)
  Reports the cached/cacheable memory requirements.
- LayerMemoryReport.Builder cacheMemory(Map<CacheMode,Long> cacheModeMemoryFixed, Map<CacheMode,Long> cacheModeMemoryVariablePerEx)
  Reports the cached/cacheable memory requirements.
- LayerMemoryReport.Builder standardMemory(long parameterSize, long updaterStateSize)
  Report the standard memory.
- LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, long fixedTrain, long variableTrainPerEx)
  Report the working memory size, for both inference and training.
- LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, Map<CacheMode,Long> fixedTrain, Map<CacheMode,Long> variableTrainPerEx)
  Report the working memory requirements, for both inference and training.
Method Detail
-
standardMemory
public LayerMemoryReport.Builder standardMemory(long parameterSize, long updaterStateSize)
Report the standard memory.
- Parameters:
  - parameterSize - Number of parameters
  - updaterStateSize - Size of the array for the updater state
-
workingMemory
public LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, long fixedTrain, long variableTrainPerEx)
Report the working memory size, for both inference and training.
- Parameters:
  - fixedInference - Number of elements used for inference (independent of minibatch size)
  - variableInferencePerEx - Number of elements used for inference, for each example
  - fixedTrain - Number of elements used for training (independent of minibatch size)
  - variableTrainPerEx - Number of elements used for training, for each example
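Because the "fixed" values are independent of minibatch size while the "variablePerEx" values scale with it, the total element count for a given minibatch can be estimated as below. This is a minimal self-contained sketch; the helper method and the numeric figures are hypothetical, not part of this API:

```java
public class WorkingMemoryExample {
    // Hypothetical helper (not part of LayerMemoryReport): combines a fixed
    // element count with a per-example count that scales with minibatch size.
    static long totalElements(long fixed, long variablePerEx, long minibatchSize) {
        return fixed + variablePerEx * minibatchSize;
    }

    public static void main(String[] args) {
        long fixedTrain = 1_000;       // independent of minibatch size (hypothetical)
        long variableTrainPerEx = 250; // per example (hypothetical)
        // For a minibatch of 32: 1,000 + 250 * 32 = 9,000 elements
        System.out.println(totalElements(fixedTrain, variableTrainPerEx, 32));
    }
}
```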
-
workingMemory
public LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, Map<CacheMode,Long> fixedTrain, Map<CacheMode,Long> variableTrainPerEx)
Report the working memory requirements, for both inference and training. As noted in MemoryReport, working memory is memory that will be allocated in an ND4J workspace, or that can be garbage collected at any point after the method returns.
- Parameters:
  - fixedInference - Number of elements of working memory used for inference (independent of minibatch size)
  - variableInferencePerEx - Number of elements of working memory used for inference, for each example
  - fixedTrain - Number of elements of working memory used for training (independent of minibatch size), for each cache mode
  - variableTrainPerEx - Number of elements of working memory used for training, for each example, for each cache mode
-
cacheMemory
public LayerMemoryReport.Builder cacheMemory(long cacheModeMemoryFixed, long cacheModeMemoryVariablePerEx)
Reports the cached/cacheable memory requirements. This method assumes the cache mode memory is the same for all cache modes; it is typically used with zeros (for layers that do not use caching).
- Parameters:
  - cacheModeMemoryFixed - Number of elements of cache memory, independent of the minibatch size
  - cacheModeMemoryVariablePerEx - Number of elements of cache memory, for each example
-
cacheMemory
public LayerMemoryReport.Builder cacheMemory(Map<CacheMode,Long> cacheModeMemoryFixed, Map<CacheMode,Long> cacheModeMemoryVariablePerEx)
Reports the cached/cacheable memory requirements.
- Parameters:
  - cacheModeMemoryFixed - Number of elements of cache memory, independent of the minibatch size, for each cache mode
  - cacheModeMemoryVariablePerEx - Number of elements of cache memory, for each example, for each cache mode
-
build
public LayerMemoryReport build()
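Putting the methods together, a report is typically assembled with a chain of builder calls ending in build(). The sketch below is illustrative only: the Builder constructor arguments (layer name, layer class, input/output types) and all numeric values are assumptions, not taken from this page.

```java
// Illustrative sketch only: constructor arguments and figures are assumptions.
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

InputType in  = InputType.feedForward(128);  // hypothetical input type
InputType out = InputType.feedForward(64);   // hypothetical output type
long numParams = 128 * 64 + 64;              // weights + biases for this sketch

LayerMemoryReport report = new LayerMemoryReport.Builder("myDenseLayer", DenseLayer.class, in, out)
        .standardMemory(numParams, 2 * numParams) // e.g. Adam keeps 2 updater values per param
        .workingMemory(0, 64, 0, 2 * 64)          // per-example working memory only
        .cacheMemory(0, 0)                        // this layer does no caching
        .build();
```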