This is a simple function that generates a variety of network connectivities and consequently architectures anywhere from a three-layer feedforward network to a neural pool of dynamics. It supports recurrent connections, lateral and self-connections, feedforward connections, any degree of sparsity (0 to 100% connectivity) and both excitatory and inhibitory connections.
##A generic network connectivity architecture
##by Alexandros Kourkoulas-Chondrorizos
##v0.2
##This is a simple function that generates a variety
##of network connectivities and consequently architectures
##anywhere from a three-layer feedforward network
##to a neural pool of dynamics. It supports recurrent
##connections, lateral and self-connections, feedforward
##connections (obviously), any degree of sparsity (0 to
##100% connectivity) and both excitatory and inhibitory connections.
##This function takes three arguments: N, C and D. N is a
##1 by n list of integers larger than 1, where n is an
##integer from 1 to 3 and signifies the number of layers
##in the architecture of the network. Each value in N is the
##number of neurons in each layer. C and D are L by L lists,
##where L is the number of layers in the architecture. C stores
##the synaptic strength of each set of connections and D stores
##the sparsity of each set of connections. This function builds
##subsets of connections with individual strengths and sparsities
##which connect any one layer to another. The subsets are then
##concatenated into a big matrix which describes the connectivity
##and the overall architecture of the network.
##
## Example: a three-layer net where only the diagonal blocks of D are
## nonzero, so each layer connects only to itself (recurrent/lateral
## connections rather than a feedforward chain)
## >>>N=[2, 3, 2]#neurons per layer
## >>>C=[[1,2,3],[4,5,4],[3,2,1]]#connection strength
## >>>D=[[.5,0,0],[0,.7,0],[0,0,.5]]#connection density
## >>>S,K=netgen(N,C,D)
## >>>print S
##[[ 0. 0.96162857 0. 0. 0. 0. 0. ]
## [ 0.15385049 0. 0. 0. 0. 0. 0. ]
## [ 0. 0. 0. 0. 0. 0. 0. ]
## [ 0. 0. 0.17980379 0.09374685 0. 0. 0. ]
## [ 0. 0. 2.41982174 1.72593395 0. 0. 0. ]
## [ 0. 0. 0. 0. 0. 0.99835747 0. ]
## [ 0. 0. 0. 0. 0. 0.98608675 0.985044 ]]
from numpy import shape, zeros
from secnet import sprand #sprand(m,n,density): sparse random matrix (secnet is not included in this gist)
def netgen(N,C,D): #generic network architecture
    L=len(N) #number of layers
    K=sum(N) #total number of neurons
    #check parameters here
    if not isinstance(N,list) or L<1 or L>3 or len(shape(N))!=1 or K<2:
        raise ValueError('N should be a 1 by n list of integers >1, where n is an integer from 1 to 3')
    if shape(C)!=(L,L) or shape(D)!=(L,L):
        raise ValueError('C and D should both be L by L lists, where L is the number of layers of the network')
    S=zeros((K,K)) #empty synaptic matrix
    for i in range(L): #rows of the (i,j) block belong to layer i
        for j in range(L): #columns of the (i,j) block belong to layer j
            #fill the (i,j) block with a sparse random matrix of density D[i][j],
            #scaled by the connection strength C[i][j]
            S[(sum(N[:i+1])-N[i]):sum(N[:i+1]),(sum(N[:j+1])-N[j]):sum(N[:j+1])]= \
                (C[i][j]*sprand(N[i],N[j],D[i][j]).todense())
    return S,K #synaptic matrix and total number of neurons
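
The secnet module that provides sprand is not part of this gist. A minimal stand-in, assuming sprand(m, n, density) is meant to behave like MATLAB's sprand (an m-by-n sparse matrix with roughly the given fraction of nonzero entries, drawn uniformly from [0, 1)), could be built on scipy.sparse:

from scipy import sparse

def sprand(m, n, density):
    #sparse m-by-n matrix; about density*m*n entries are nonzero, values uniform on [0, 1)
    return sparse.random(m, n, density=density, format='csr')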
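
For comparison with the docstring example, here is a hypothetical call that yields genuinely feedforward connectivity: only off-diagonal blocks of D are nonzero, so every connection couples two different layers. The (i, j) block pairs the rows of layer i with the columns of layer j; the gist does not fix a pre/post convention, so the direction reading in the comments is an assumption.

N=[2, 3, 2]                    #neurons per layer
C=[[0,2,0],[0,0,3],[0,0,0]]    #connection strength per block
D=[[0,.5,0],[0,0,.7],[0,0,0]]  #connection density per block
S,K=netgen(N,C,D)
print(S.shape)                 #(7, 7); nonzeros appear only in the (0,1) and (1,2) blocks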