setup_cluster_sharedmatrix.m
% Start your pool!
% matlabpool open
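% for example (the profile name and worker count here are placeholders,
% not from the original setup - adapt them to your own cluster configuration):
% matlabpool open myClusterProfile 16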
%%
% this cell queries the pool to find a single worker on
% each physical machine
% get the host and worker name for every lab
spmd
    t = getCurrentWorker();
    ids = labindex;
    hosts = get(t,'Host');
    workers = get(t,'Name');
end
% convert from the Composite objects returned by spmd to plain cell arrays
ids = {ids{:}};
hosts = {hosts{:}};
workers = {workers{:}};
uniquehosts = unique(hosts);
uniqueworkersidx = zeros(1,length(uniquehosts));
for hi=1:length(uniquehosts)
    % logical index of all workers on this host
    idx = cellfun(@(x) strcmpi(x, uniquehosts{hi}), hosts);
    % just take the first one
    uniqueworkersidx(hi) = find(idx, 1, 'first');
end
% set up shared memory on each host
% using only one worker per host
uniqueids = [ids{uniqueworkersidx}];
% uniqueids now contains the labindex of one worker per physical machine
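% for example (hypothetical values, just to illustrate): with 8 workers
% spread evenly over 2 machines, uniqueids might come out as [1 5]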
%%
% set this to a unique 5-digit key that identifies the shared memory
% segment for your project
shmkey = 11112;
spmd
    % add the path on all workers
    % this should be a shared drive or somewhere that all workers can access,
    % containing the sharedmatrix package (if not already set up on each worker)
    % and your customised clonesharedmemory() function for loading your data set
    addpath('/analyse/pilots/robini/code')
    addpath('/analyse/pilots/robini/code/sharedmatrix')
    % only clone the shared memory on one worker per physical machine
    if ismember(labindex, uniqueids)
        % clonesharedmemory needs to be a function you write to load
        % your data as appropriate (a rough sketch follows this block)
        % it has to be a function to avoid having the data in the spmd
        % scope, where it would be transmitted back to the client machine
        clonesharedmemory(shmkey)
        fprintf(1,'Id %i OK\n', labindex)
    end
end
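% a rough sketch of what clonesharedmemory() might look like, saved as
% clonesharedmemory.m somewhere on the shared path added above
% (the .mat file path and variable name 'X' are placeholders - load your
% data however is appropriate; this assumes the sharedmatrix 'clone'
% command for copying an array into shared memory)
%
% function clonesharedmemory(shmkey)
% % load the data inside the function so it never enters the spmd scope
% s = load('/path/to/your/data.mat', 'X');
% % copy it into the shared memory segment identified by shmkey
% sharedmatrix('clone', shmkey, s.X);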
%%
% now you can use the shared memory in all your workers
% things go a bit wonky if you don't detach and free properly,
% so attach to the big data at the start of each iteration, extract the
% slice or data you need, and then detach ASAP
% don't add the shared memory calls until there are no other errors!
xavg = zeros(1,1000);
parfor i=1:1000
    x = sharedmatrix('attach', shmkey);
    xavg(i) = mean(x(i,:));
    sharedmatrix('detach', shmkey, x);
end
%%
% it might be a good idea to add a flag so you can run with or without
% sharedmatrix for testing... something like this (just to give the idea
% - not tested)
% doshared = false;
% parfor i=1:1000
%     if doshared
%         x = sharedmatrix('attach', shmkey);
%     else
%         x = plain_data_load();
%     end
%     xavg(i) = mean(x(i,:));
%     if doshared
%         sharedmatrix('detach', shmkey, x);
%     end
% end
%%
% remember to free it at the end
% this will give an error if it is already freed
spmd
    if ismember(labindex, uniqueids)
        sharedmatrix('free',shmkey);
    end
end
%%
% this frees on all workers just to be sure, and catches any errors
spmd
    try sharedmatrix('free',shmkey); catch, end
end