NuPIC
0.2.7.dev0
Numenta Platform for Intelligent Computing

Class implementing the temporal pooler algorithm as described in the published Cortical Learning Algorithm documentation. More...
Public Member Functions  
def  __setstate__ 
Set our state from a serialized state.  
def  saveToFile 
Save Cells4 state to this file.  
def  loadFromFile 
Load Cells4 state from this file.  
def  __getattr__ 
Patch getattr so that we can catch the first access to 'cells' and load. More...  
def  compute 
Handle one compute, possibly learning. More...  
def  inferPhase2 
This calls phase 2 of inference (used in multistep prediction).  
def  reset 
Reset the state of all cells. More...  
def  finishLearning 
Called when learning has been completed. More...  
def  trimSegments 
This method deletes all synapses where permanence value is strictly less than self.connectedPerm. More...  
def  printSegment 
Print a segment; used for debugging.  
def  slowIsSegmentActive 
A segment is active if it has >= activationThreshold connected synapses that are active due to infActiveState.  
def  getAvgLearnedSeqLength 
Return our moving average of learned sequence length.  
def  getColCellIdx 
Get column and cell within column from a global cell index. More...  
def  getSegmentOnCell 
Return segment number segIdx on cell (c,i). More...  
def  getNumSegments 
Return the total number of segments. More...  
def  getNumSynapses 
Return the total number of synapses. More...  
def  getNumSegmentsInCell 
Return the total number of segments in cell (c,i)  
def  getSegmentInfo 
Returns information about the distribution of segments, synapses and permanence values in the current TP. More...  
def  getActiveSegment 
For a given cell, return the segment with the strongest connected activation, i.e. More...  
def  getBestMatchingCell 
Find weakly activated cell in column. More...  
def  getLeastAllocatedCell 
For the given column, return the cell with the fewest number of segments. More...  
def  isSegmentActive 
The following methods are implemented in the base class but should never be called in this implementation.  
Public Member Functions inherited from TP  
def  __init__ 
Construct the TP. More...  
def  saveToFile 
Implemented in TP10X2.TP10X2.saveToFile.  
def  loadFromFile 
Implemented in TP10X2.TP10X2.loadFromFile.  
def  reset 
Reset the state of all cells. More...  
def  resetStats 
Reset the learning and inference stats. More...  
def  getStats 
Return the current learning and inference stats. More...  
def  printState 
Print an integer array that is the same shape as activeState. More...  
def  printConfidence 
Print a floating point array that is the same shape as activeState. More...  
def  printColConfidence 
Print up to maxCols number from a flat floating point array. More...  
def  printStates 
def  printOutput 
def  printInput 
def  printParameters 
Print the parameter settings for the TP.  
def  printActiveIndices 
Print the list of [column, cellIdx] indices for each of the active cells in state. More...  
def  printComputeEnd 
Called at the end of inference to print out various diagnostic information based on the current verbosity level. More...  
def  printSegmentUpdates 
def  printCell 
def  printCells 
def  getNumSegmentsInCell 
def  getNumSynapses 
def  getNumStrongSynapses 
def  getNumStrongSynapsesPerTimeSlot 
def  getNumSynapsesPerSegmentMax 
def  getNumSynapsesPerSegmentAvg 
def  getNumSegments 
def  getNumCells 
def  getSegmentOnCell 
def  addToSegmentUpdates 
Store a dated potential segment update. More...  
def  removeSegmentUpdate 
Remove a segment update (called when seg update expires or is processed) More...  
def  computeOutput 
Computes output for both learning and inference. More...  
def  getActiveState 
Return the current active state. More...  
def  getPredictedState 
Return a numpy array, predictedCells, representing the current predicted state. More...  
def  predict 
This function gives the future predictions for <nSteps> timesteps starting from the current TP state. More...  
def  getAvgLearnedSeqLength 
def  inferBacktrack 
This "backtracks" our inference state, trying to see if we can lock onto the current set of inputs by assuming the sequence started up to N steps ago on start cells. More...  
def  inferPhase1 
Update the inference active state from the last set of predictions and the current bottomup. More...  
def  inferPhase2 
Phase 2 for the inference state. More...  
def  updateInferenceState 
Update the inference state. More...  
def  learnBacktrack 
This "backtracks" our learning state, trying to see if we can lock onto the current set of inputs by assuming the sequence started up to N steps ago on start cells. More...  
def  learnPhase1 
Compute the learning active state given the predicted state and the bottomup input. More...  
def  learnPhase2 
Compute the predicted segments given the current set of active cells. More...  
def  updateLearningState 
Update the learning state. More...  
def  compute 
Handle one compute, possibly learning. More...  
def  infer 
def  learn 
def  updateSegmentDutyCycles 
This gets called on every compute. More...  
def  columnConfidences 
Compute the column confidences given the cell confidences. More...  
def  topDownCompute 
Top-down compute: generate expected input given output of the TP. More...  
def  trimSegmentsInCell 
This method goes through a list of segments for a given cell and deletes all synapses whose permanence is less than minPermanence and deletes any segments that have less than minNumSyns synapses remaining. More...  
def  trimSegments 
This method deletes all synapses whose permanence is less than minPermanence and deletes any segments that have less than minNumSyns synapses remaining. More...  
def  cleanUpdatesList 
Removes any update that would be for the given col, cellIdx, segIdx. More...  
def  finishLearning 
Called when learning has been completed. More...  
def  checkPrediction2 
This function will replace checkPrediction. More...  
def  isSegmentActive 
A segment is active if it has >= activationThreshold connected synapses that are active due to activeState. More...  
def  getSegmentActivityLevel 
This routine computes the activity level of a segment given activeState. More...  
def  getBestMatchingCell 
Find weakly activated cell in column with at least minThreshold active synapses. More...  
def  getBestMatchingSegment 
For the given cell, find the segment with the largest number of active synapses. More...  
def  getCellForNewSegment 
Return the index of a cell in this column which is a good candidate for adding a new segment. More...  
def  getSegmentActiveSynapses 
Return a segmentUpdate data structure containing a list of proposed changes to segment s. More...  
def  chooseCellsToLearnFrom 
Choose n random cells to learn from. More...  
def  processSegmentUpdates 
Go through the list of accumulated segment updates and process them as follows: More...  
def  adaptSegment 
This function applies segment update information to a segment in a cell. More...  
def  getSegmentInfo 
Returns information about the distribution of segments, synapses and permanence values in the current TP. More...  
Additional Inherited Members  
Public Attributes inherited from TP  
version  
numberOfCols  
cellsPerColumn  
initialPerm  
connectedPerm  
minThreshold  
newSynapseCount  
permanenceInc  
permanenceDec  
permanenceMax  
globalDecay  
activationThreshold  
doPooling  
Allows pooling to be turned off.  
segUpdateValidDuration  
burnIn  
Used for evaluating the prediction score.  
collectStats  
If true, collect training/inference stats.  
seed  
verbosity  
pamLength  
maxAge  
maxInfBacktrack  
maxLrnBacktrack  
maxSeqLength  
maxSegmentsPerCell  
maxSynapsesPerSegment  
outputType  
activeColumns  
cells  
Cells are indexed by column and index within the column. Every self.cells[column][index] contains a list of segments. Each segment is a structure of class Segment.  
lrnIterationIdx  
iterationIdx  
segID  
Unique segment ID, so we can put segments in hashes.  
currentOutput  
pamCounter  
pamCounter gets reset to pamLength whenever we detect that the learning state is making good predictions (at least half the columns predicted). More...  
collectSequenceStats  
If True, the TP will compute a signature for each sequence.  
resetCalled  
This gets set when we receive a reset and cleared on the first compute following a reset. More...  
avgInputDensity  
We keep track of the average input density here.  
learnedSeqLength  
Keeps track of the length of the sequence currently being learned. More...  
avgLearnedSeqLength  
Keeps track of the moving average of all learned sequence length. More...  
segmentUpdates  
We store the lists of segments updates, per cell, so that they can be applied later during learning, when the cell gets bottomup activation. More...  
cellConfidence  
colConfidence  
lrnActiveState  
infActiveState  
lrnPredictedState  
infPredictedState  
Class implementing the temporal pooler algorithm as described in the published Cortical Learning Algorithm documentation.
The implementation here attempts to closely match the pseudocode in the documentation. This implementation does contain several additional bells and whistles such as a column confidence measure.
def __getattr__(self, name)
Patch getattr so that we can catch the first access to 'cells' and load.
This function is only called when we try to access an attribute that doesn't exist. We purposely make sure that "self.cells" doesn't exist after unpickling so that we'll hit this, then we can load it on the first access.
If this is called at any other time, it will raise an AttributeError.
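The lazy-load pattern described above can be sketched in plain Python. The class and helper names below are hypothetical, not the actual Cells4 API; the point is only the mechanics of `__getattr__`:

```python
class LazyCells(object):
    """Minimal sketch of lazy-loading an attribute via __getattr__."""

    def _loadCells(self):
        # Stand-in for deserializing the real cell data from disk.
        return [[[] for _ in range(4)] for _ in range(8)]

    def __getattr__(self, name):
        # Python calls __getattr__ only when normal attribute lookup
        # fails. 'cells' is deliberately absent after unpickling, so the
        # first access lands here; we load it, cache it, and return it.
        if name == "cells":
            self.cells = self._loadCells()  # cached: won't fire again
            return self.cells
        raise AttributeError(name)
```

Because the loaded value is stored in the instance dictionary, every later access to `cells` bypasses `__getattr__` entirely, which is exactly why it only triggers on the first access.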
def compute(self, bottomUpInput, enableLearn, computeInfOutput=None)
Handle one compute, possibly learning.
By default, we don't compute the inference output when learning because it slows things down, but you can override this by passing True for computeInfOutput.
def finishLearning(self)
Called when learning has been completed.
This method just calls trimSegments. (finishLearning is kept for backward compatibility.)
def getActiveSegment(self, c, i, timeStep)
For a given cell, return the segment with the strongest connected activation, i.e. the highest sum of activations over connected synapses only. That is, a segment is active only if it has enough active connected synapses.
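The connected-activation rule can be illustrated with a small sketch. The `(col, cellIdx, permanence)` synapse tuples below are a simplification for illustration, not the internal Cells4 representation:

```python
def connectedActivation(segment, activeCells, connectedPerm):
    """Count active synapses that are connected (perm >= connectedPerm).

    Only connected synapses contribute to a segment's activation; a
    synapse below the connection threshold is ignored even if its
    source cell is active.
    """
    return sum(1 for (col, cellIdx, perm) in segment
               if perm >= connectedPerm and (col, cellIdx) in activeCells)

def getActiveSegment(segments, activeCells, connectedPerm, activationThreshold):
    """Return the index of the segment with the strongest connected
    activation, or None if no segment reaches activationThreshold."""
    bestIdx, bestScore = None, activationThreshold - 1
    for segIdx, seg in enumerate(segments):
        score = connectedActivation(seg, activeCells, connectedPerm)
        if score > bestScore:
            bestIdx, bestScore = segIdx, score
    return bestIdx
```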
def getBestMatchingCell(self, c, timeStep, learnState=False)
Find weakly activated cell in column.
Returns the index and segment of the most activated segment above minThreshold.
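In contrast to getActiveSegment, the "weak" match counts active synapses regardless of whether they are connected. A sketch, again using simplified `(col, cellIdx, permanence)` synapse tuples and a lowest-index tie-break (the real implementation's tie-breaking may differ):

```python
def getBestMatchingCell(columnCells, activeCells, minThreshold):
    """Find (cellIdx, segIdx) of the segment with the most active
    synapses, connected or not, provided at least minThreshold are
    active; otherwise return (None, None)."""
    bestCell = bestSeg = None
    bestCount = minThreshold - 1  # so count >= minThreshold qualifies
    for cellIdx, segments in enumerate(columnCells):
        for segIdx, seg in enumerate(segments):
            count = sum(1 for (c, i, _perm) in seg
                        if (c, i) in activeCells)
            if count > bestCount:
                bestCell, bestSeg, bestCount = cellIdx, segIdx, count
    return bestCell, bestSeg
```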
def getColCellIdx(self, idx)
Get column and cell within column from a global cell index.
The global index is idx = colIdx * nCellsPerCol() + cellIdxInCol. This method returns (colIdx, cellIdxInCol).
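The inversion is a plain divmod, as this standalone sketch shows (the function here takes cellsPerCol as an argument rather than reading it from the instance):

```python
def getColCellIdx(idx, cellsPerCol):
    """Invert the global cell index:
    idx = colIdx * cellsPerCol + cellIdxInCol."""
    return divmod(idx, cellsPerCol)
```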
def getLeastAllocatedCell(self, c)
For the given column, return the cell with the fewest number of segments.
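A one-line sketch of this selection, taking the column's cells as a list of segment lists; ties resolve to the lowest cell index here, which may not match the real implementation's tie-breaking:

```python
def getLeastAllocatedCell(columnCells):
    """Return the index of the cell with the fewest segments."""
    return min(range(len(columnCells)), key=lambda i: len(columnCells[i]))
```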
def getNumSegments(self)
Return the total number of segments.
def getNumSynapses(self)
Return the total number of synapses.
def getSegmentInfo(self, collectActiveData=False)
Returns information about the distribution of segments, synapses and permanence values in the current TP.
If requested, also returns information regarding the number of currently active segments and synapses.
The method returns the following tuple:
( nSegments,        # total number of segments
  nSynapses,        # total number of synapses
  nActiveSegs,      # total number of active segments
  nActiveSynapses,  # total number of active synapses
  distSegSizes,     # a dict where d[n] = number of segments with n synapses
  distNSegsPerCell, # a dict where d[n] = number of cells with n segments
  distPermValues,   # a dict where d[p] = number of synapses with perm = p/10
  distAges )        # a list of tuples (ageRange, numSegments)
nActiveSegs and nActiveSynapses are 0 if collectActiveData is False
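The two distribution dictionaries can be sketched as follows, using the simplified nested-list layout (columns → cells → segments → synapses) assumed throughout these examples; counting only cells that have at least one segment in distNSegsPerCell is an assumption here:

```python
from collections import defaultdict

def segmentDistributions(cells):
    """Compute distSegSizes (d[n] = number of segments with n synapses)
    and distNSegsPerCell (d[n] = number of cells with n segments,
    counting only cells with at least one segment)."""
    distSegSizes = defaultdict(int)
    distNSegsPerCell = defaultdict(int)
    for column in cells:
        for cell in column:
            if cell:
                distNSegsPerCell[len(cell)] += 1
            for seg in cell:
                distSegSizes[len(seg)] += 1
    return dict(distSegSizes), dict(distNSegsPerCell)
```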
def getSegmentOnCell(self, c, i, segIdx)
Return segment number segIdx on cell (c,i).
Returns the segment as the following list:

[ [segIdx, sequenceSegmentFlag, positive activations, total activations, last active iteration],
  [col1, idx1, perm1],
  [col2, idx2, perm2],
  ... ]
def reset(self)
Reset the state of all cells.
This is normally used between sequences while training. All internal states are reset to 0.
def trimSegments(self, minPermanence=None, minNumSyns=None)
This method deletes all synapses where permanence value is strictly less than self.connectedPerm.
It also deletes all segments where the number of connected synapses is strictly less than self.activationThreshold. Returns the number of segments and synapses removed. This is often done after formal learning has completed, so that subsequent inference runs faster.

minPermanence: Any synapse whose permanence is 0 or less than minPermanence will be deleted. If None is passed in, self.connectedPerm is used.
minNumSyns: Any segment with fewer than minNumSyns synapses remaining in it will be deleted. If None is passed in, self.activationThreshold is used.
retval: (numSegsRemoved, numSynsRemoved)
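A sketch of the trimming logic, once more using the simplified `(col, cellIdx, permanence)` synapse tuples. Counting the surviving synapses of a deleted segment as removed is one plausible convention, assumed here:

```python
def trimSegments(cells, minPermanence, minNumSyns):
    """Delete weak synapses, then delete segments left too small.
    Returns (numSegsRemoved, numSynsRemoved)."""
    nSegsRemoved = nSynsRemoved = 0
    for column in cells:
        for cell in column:
            keptSegs = []
            for seg in cell:
                # Drop synapses with permanence 0 or below minPermanence.
                kept = [syn for syn in seg
                        if syn[2] > 0 and syn[2] >= minPermanence]
                nSynsRemoved += len(seg) - len(kept)
                if len(kept) < minNumSyns:
                    # Segment too small: delete it and its survivors.
                    nSegsRemoved += 1
                    nSynsRemoved += len(kept)
                else:
                    keptSegs.append(kept)
            cell[:] = keptSegs
    return nSegsRemoved, nSynsRemoved
```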