NuPIC  0.2.1.dev0
Numenta Platform for Intelligent Computing
Public Member Functions
TPTrivial Class Reference

Class implementing a trivial temporal pooler algorithm. More...

Inheritance diagram for TPTrivial:
TP

Public Member Functions

def __init__
 We use the same keyword arguments as TP()
 
def __setstate__
 Set the state of ourself from a serialized state.
 
def __getattr__
 Patch getattr so that we can catch the first access to 'cells' and load. More...
 
def infer
 Do one iteration of the temporal pooler inference. More...
 
def learn
 Do one iteration of the temporal pooler learning. More...
 
def reset
 Reset the state of all cells. More...
 
def trimSegments
 This method does nothing in this implementation. More...
 
def printSegment
 The following are print functions for debugging.
 
def getColCellIdx
 Get column and cell within column from a global cell index. More...
 
def getSegmentOnCell
 Return segment number segIdx on cell (c,i). More...
 
def getNumSegments
 Return the total number of segments. More...
 
def getNumSynapses
 Return the total number of synapses. More...
 
def getNumSegmentsInCell
 Return the total number of segments in cell (c,i)
 
def getSegmentInfo
 Returns information about the distribution of segments, synapses and permanence values in the current TP. More...
 
def getActiveSegment
 The following methods are implemented in the base class but should never be called in this implementation. More...
 
def getBestMatchingCell
 Find weakly activated cell in column. More...
 
def getLeastAllocatedCell
 For the given column, return the cell with the fewest number of segments. More...
 
- Public Member Functions inherited from TP
def __init__
 Construct the TP. More...
 
def saveToFile
 Implemented in TP10X2.TP10X2.saveToFile.
 
def loadFromFile
 Implemented in TP10X2.TP10X2.loadFromFile.
 
def reset
 Reset the state of all cells. More...
 
def resetStats
 Reset the learning and inference stats. More...
 
def getStats
 Return the current learning and inference stats. More...
 
def printState
 Print an integer array that is the same shape as activeState. More...
 
def printConfidence
 Print a floating point array that is the same shape as activeState. More...
 
def printColConfidence
 Print up to maxCols number from a flat floating point array. More...
 
def printStates
 
def printOutput
 
def printInput
 
def printParameters
 Print the parameter settings for the TP.
 
def printActiveIndices
 Print the list of [column, cellIdx] indices for each of the active cells in state. More...
 
def printComputeEnd
 Called at the end of inference to print out various diagnostic information based on the current verbosity level. More...
 
def printSegmentUpdates
 
def printCell
 
def printCells
 
def getNumSegmentsInCell
 
def getNumSynapses
 
def getNumStrongSynapses
 
def getNumStrongSynapsesPerTimeSlot
 
def getNumSynapsesPerSegmentMax
 
def getNumSynapsesPerSegmentAvg
 
def getNumSegments
 
def getNumCells
 
def getSegmentOnCell
 
def addToSegmentUpdates
 Store a dated potential segment update. More...
 
def removeSegmentUpdate
 Remove a segment update (called when seg update expires or is processed) More...
 
def computeOutput
 Computes output for both learning and inference. More...
 
def getActiveState
 Return the current active state. More...
 
def getPredictedState
 Return a numpy array, predictedCells, representing the current predicted state. More...
 
def predict
 This function gives the future predictions for <nSteps> timesteps starting from the current TP state. More...
 
def getAvgLearnedSeqLength
 
def inferBacktrack
 This "backtracks" our inference state, trying to see if we can lock onto the current set of inputs by assuming the sequence started up to N steps ago on start cells. More...
 
def inferPhase1
 Update the inference active state from the last set of predictions and the current bottom-up. More...
 
def inferPhase2
 Phase 2 for the inference state. More...
 
def updateInferenceState
 Update the inference state. More...
 
def learnBacktrack
 This "backtracks" our learning state, trying to see if we can lock onto the current set of inputs by assuming the sequence started up to N steps ago on start cells. More...
 
def learnPhase1
 Compute the learning active state given the predicted state and the bottom-up input. More...
 
def learnPhase2
 Compute the predicted segments given the current set of active cells. More...
 
def updateLearningState
 Update the learning state. More...
 
def compute
 Handle one compute, possibly learning. More...
 
def infer
 
def learn
 
def updateSegmentDutyCycles
 This gets called on every compute. More...
 
def columnConfidences
 Compute the column confidences given the cell confidences. More...
 
def topDownCompute
 Top-down compute - generate expected input given output of the TP. More...
 
def trimSegmentsInCell
 This method goes through a list of segments for a given cell and deletes all synapses whose permanence is less than minPermanence and deletes any segments that have less than minNumSyns synapses remaining. More...
 
def trimSegments
 This method deletes all synapses whose permanence is less than minPermanence and deletes any segments that have less than minNumSyns synapses remaining. More...
 
def cleanUpdatesList
 Removes any update that would be for the given col, cellIdx, segIdx. More...
 
def finishLearning
 Called when learning has been completed. More...
 
def checkPrediction2
 This function will replace checkPrediction. More...
 
def isSegmentActive
 A segment is active if it has >= activationThreshold connected synapses that are active due to activeState. More...
 
def getSegmentActivityLevel
 This routine computes the activity level of a segment given activeState. More...
 
def getBestMatchingCell
 Find weakly activated cell in column with at least minThreshold active synapses. More...
 
def getBestMatchingSegment
 For the given cell, find the segment with the largest number of active synapses. More...
 
def getCellForNewSegment
 Return the index of a cell in this column which is a good candidate for adding a new segment. More...
 
def getSegmentActiveSynapses
 Return a segmentUpdate data structure containing a list of proposed changes to segment s. More...
 
def chooseCellsToLearnFrom
 Choose n random cells to learn from. More...
 
def processSegmentUpdates
 Go through the list of accumulated segment updates and process them as follows: More...
 
def adaptSegment
 This function applies segment update information to a segment in a cell. More...
 
def getSegmentInfo
 Returns information about the distribution of segments, synapses and permanence values in the current TP. More...
 

Additional Inherited Members

- Public Attributes inherited from TP
 version
 
 numberOfCols
 
 cellsPerColumn
 
 initialPerm
 
 connectedPerm
 
 minThreshold
 
 newSynapseCount
 
 permanenceInc
 
 permanenceDec
 
 permanenceMax
 
 globalDecay
 
 activationThreshold
 
 doPooling
 Allows pooling to be turned off.
 
 segUpdateValidDuration
 
 burnIn
 Used for evaluating the prediction score.
 
 collectStats
 If true, collect training/inference stats.
 
 seed
 
 verbosity
 
 pamLength
 
 maxAge
 
 maxInfBacktrack
 
 maxLrnBacktrack
 
 maxSeqLength
 
 maxSegmentsPerCell
 
 maxSynapsesPerSegment
 
 outputType
 
 activeColumns
 
 cells
 Cells are indexed by column and index within the column. Every self.cells[column][index] contains a list of segments. Each segment is a structure of class Segment.
 
 lrnIterationIdx
 
 iterationIdx
 
 segID
 Unique segment ID, so that segments can be stored in hashes.
 
 currentOutput
 
 pamCounter
 pamCounter gets reset to pamLength whenever we detect that the learning state is making good predictions (at least half the columns predicted). More...
 
 trivialPredictor
 
 collectSequenceStats
 If True, the TP will compute a signature for each sequence.
 
 resetCalled
 This gets set when we receive a reset and cleared on the first compute following a reset. More...
 
 avgInputDensity
 We keep track of the average input density here.
 
 learnedSeqLength
 Keeps track of the length of the sequence currently being learned. More...
 
 avgLearnedSeqLength
 Keeps track of the moving average of all learned sequence lengths. More...
 
 segmentUpdates
 We store the lists of segment updates, per cell, so that they can be applied later during learning, when the cell gets bottom-up activation. More...
 
 cellConfidence
 
 colConfidence
 
 lrnActiveState
 
 infActiveState
 
 lrnPredictedState
 
 infPredictedState
 

Detailed Description

Class implementing a trivial temporal pooler algorithm.

This temporal pooler will measure the following input statistics: how often each column is active, and average input density.

It will make various trivial predictions depending on the predictionMethod parameter:

random : output a random set of columns, maintaining average density
zeroth : output the most frequent columns, maintaining average density
all    : always predict all the columns
last   : always predict the last input
lots   : output the most frequent columns, maintaining 5*average density

The main purpose is to provide baseline data for comparison to other temporal pooler implementations.
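The prediction methods above can be sketched as follows. This is an illustrative sketch only, not the NuPIC implementation; the function name, signature, and helper logic are assumptions based on the method descriptions:

```python
import numpy as np

def predictColumns(method, history, numCols, avgDensity, rng):
    """Sketch of the trivial prediction methods.

    history    -- list of past dense binary column vectors
    numCols    -- total number of columns
    avgDensity -- average fraction of active columns seen so far
    """
    nActive = max(1, int(round(avgDensity * numCols)))
    counts = np.sum(history, axis=0)          # per-column activation counts
    prediction = np.zeros(numCols, dtype=np.int8)

    if method == "random":
        # Random set of columns at the average density.
        prediction[rng.choice(numCols, nActive, replace=False)] = 1
    elif method == "zeroth":
        # The most frequently active columns, at the average density.
        prediction[np.argsort(counts)[-nActive:]] = 1
    elif method == "all":
        prediction[:] = 1
    elif method == "last":
        prediction[:] = history[-1]
    elif method == "lots":
        # Most frequent columns, at 5x the average density.
        n = min(numCols, 5 * nActive)
        prediction[np.argsort(counts)[-n:]] = 1
    return prediction
```

Each method produces a dense binary column vector, which is the form the baseline comparison needs.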

Member Function Documentation

def __getattr__ (   self,
  name 
)

Patch getattr so that we can catch the first access to 'cells' and load.

This function is only called when we try to access an attribute that doesn't exist. We purposely make sure that "self.cells" doesn't exist after unpickling so that we'll hit this, then we can load it on the first access.

If this is called at any other time, it will raise an AttributeError. That's because:

  • If 'name' is "cells", after the first call, self._realCells won't exist so we'll get an implicit AttributeError.
  • If 'name' isn't "cells", we expect our superclass not to have __getattr__, so we'll raise our own AttributeError. If the superclass does have __getattr__, we'll just return whatever it gives us.
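The lazy-load-on-first-access pattern described above can be sketched in isolation. The class and helper names below are illustrative, not the actual TPTrivial code:

```python
class LazyCells:
    """Sketch: load an expensive attribute on first access.

    __getattr__ is only invoked when normal attribute lookup fails, so
    removing 'cells' after unpickling guarantees the first access lands here.
    """

    def __setstate__(self, state):
        self.__dict__.update(state)
        # Ensure 'cells' is absent so the first access triggers __getattr__.
        self.__dict__.pop("cells", None)

    def __getattr__(self, name):
        if name == "cells":
            # Load once and cache on the instance; later accesses bypass
            # __getattr__ because the attribute now exists.
            cells = self._loadCells()
            self.cells = cells
            return cells
        # Any other missing attribute is a genuine error.
        raise AttributeError(name)

    def _loadCells(self):
        # Stand-in for the real deserialization work.
        return [[] for _ in range(4)]
```

After the first access, `cells` lives in the instance `__dict__`, so `__getattr__` is never consulted for it again.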
def getActiveSegment (   self,
  c,
  i,
  timeStep 
)

The following methods are implemented in the base class but should never be called in this implementation.

For a given cell, return the segment with the strongest connected activation, i.e. sum up the activations of the connected synapses of the segments only. That is, a segment is active only if it has enough connected synapses.
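"Connected activation" as described above can be sketched with a small helper. This is not the TP implementation; the data layout (a list of (col, cellIdx, permanence) synapses) and the default threshold are assumptions:

```python
def connectedActivation(segment, activeState, connectedPerm=0.5):
    """Sketch: sum activations over connected synapses only.

    segment       -- list of (col, cellIdx, permanence) synapses
    activeState   -- mapping indexed as activeState[col][cellIdx], 0/1 activity
    connectedPerm -- assumed threshold; the real TP reads it from its params
    """
    return sum(
        activeState[col][cellIdx]
        for col, cellIdx, perm in segment
        if perm >= connectedPerm   # only connected synapses contribute
    )
```

A segment would then count as active only when this sum reaches the activation threshold.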

def getBestMatchingCell (   self,
  c,
  timeStep,
  learnState = False 
)

Find weakly activated cell in column.

Returns index and segment of most activated segment above minThreshold.

def getColCellIdx (   self,
  idx 
)

Get column and cell within column from a global cell index.

The global index is idx = colIdx * nCellsPerCol() + cellIdxInCol. This method returns (colIdx, cellIdxInCol).
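The mapping above is a straightforward divmod. A minimal sketch (the cellsPerColumn value in the usage below is arbitrary):

```python
def getColCellIdx(idx, cellsPerColumn):
    """Convert a global cell index into (colIdx, cellIdxInCol).

    Inverse of: idx = colIdx * cellsPerColumn + cellIdxInCol
    """
    return divmod(idx, cellsPerColumn)
```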

def getLeastAllocatedCell (   self,
  c 
)

For the given column, return the cell with the fewest number of segments.

def getNumSegments (   self)

Return the total number of segments.

def getNumSynapses (   self)

Return the total number of synapses.

def getSegmentInfo (   self,
  collectActiveData = False 
)

Returns information about the distribution of segments, synapses and permanence values in the current TP.

If requested, also returns information regarding the number of currently active segments and synapses.

The method returns the following tuple:

( nSegments,        # total number of segments
  nSynapses,        # total number of synapses
  nActiveSegs,      # total no. of active segments
  nActiveSynapses,  # total no. of active synapses
  distSegSizes,     # a dict where d[n] = number of segments with n synapses
  distNSegsPerCell, # a dict where d[n] = number of cells with n segments
  distPermValues )  # a dict where d[p] = number of synapses with perm = p/10

nActiveSegs and nActiveSynapses are 0 if collectActiveData is False.

def getSegmentOnCell (   self,
  c,
  i,
  segIdx 
)

Return segment number segIdx on cell (c,i).

Returns the segment as the following list:

[ [segIdx, sequenceSegmentFlag, frequency],
  [col1, idx1, perm1],
  [col2, idx2, perm2], ... ]
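Given the list layout above, a caller might unpack the result like this (a sketch; the field names in the returned dict are assumptions, not NuPIC API):

```python
def describeSegment(seg):
    """Unpack the list returned by getSegmentOnCell into a dict.

    Expected layout: [[segIdx, sequenceSegmentFlag, frequency],
                      [col1, idx1, perm1], [col2, idx2, perm2], ...]
    """
    segIdx, sequenceSegmentFlag, frequency = seg[0]
    synapses = [
        {"col": col, "cellIdx": cellIdx, "perm": perm}
        for col, cellIdx, perm in seg[1:]
    ]
    return {
        "segIdx": segIdx,
        "isSequenceSeg": bool(sequenceSegmentFlag),
        "frequency": frequency,
        "synapses": synapses,
    }
```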

def infer (   self,
  bottomUpInput 
)

Do one iteration of the temporal pooler inference.

Parameters:

bottomUpInput: Current bottom-up input, dense.
retval: ?

def learn (   self,
  bottomUpInput 
)

Do one iteration of the temporal pooler learning.

Parameters:

bottomUpInput: Current bottom-up input, dense.
retval: ?

def reset (   self)

Reset the state of all cells.

This is normally used between sequences while training. All internal states are reset to 0.

def trimSegments (   self)

This method does nothing in this implementation.

