@jganoff
Created August 1, 2012 19:29
Pentaho Big Data Plugin: Hadoop Configuration Testing with ShrinkWrap and VFS
// Assumed imports: java.io.OutputStream, java.util.Properties,
// org.apache.commons.vfs2.{FileObject, FileType, VFS} (the original may use Pentaho's VFS fork),
// org.jboss.shrinkwrap.api.{ShrinkWrap, exporter.ZipExporter, spec.JavaArchive},
// plus JUnit's assertEquals and the Pentaho HadoopShim / MockHadoopShim classes.

// Create a test hadoop configuration "a"
FileObject ramRoot = VFS.getManager().resolveFile(HADOOP_CONFIGURATIONS_PATH);
FileObject aConfigFolder = ramRoot.resolveFile("hadoop-configurations/a");
aConfigFolder.createFolder();
assertEquals(FileType.FOLDER, aConfigFolder.getType());

// Create the properties file for the configuration as hadoop-configurations/a/config.properties
FileObject configFile = aConfigFolder.resolveFile("config.properties");
Properties p = new Properties();
p.setProperty("name", "Test Configuration A");
p.setProperty("classpath", "");
p.setProperty("library.path", "");
// Close the VFS output stream so the write is flushed to the RAM filesystem
try (OutputStream out = configFile.getContent().getOutputStream()) {
  p.store(out, "Test Configuration A");
}

// Create the implementation jar
FileObject implJar = aConfigFolder.resolveFile("a-config.jar");
implJar.createFile();

// Use ShrinkWrap to build the jar in memory and write it out through VFS
JavaArchive archive = ShrinkWrap.create(JavaArchive.class, "a-configuration.jar")
    .addAsServiceProvider(HadoopShim.class, MockHadoopShim.class)
    .addClass(MockHadoopShim.class);
try (OutputStream out = implJar.getContent().getOutputStream()) {
  archive.as(ZipExporter.class).exportTo(out);
}
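For context, ShrinkWrap's addAsServiceProvider call amounts to writing a META-INF/services entry that names the implementation class, which is what ServiceLoader reads at runtime. A stdlib-only sketch of building an equivalent jar with java.util.jar (the SPI and implementation class names below are placeholders, not the real Pentaho ones):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.jar.JarEntry;
import java.util.jar.JarInputStream;
import java.util.jar.JarOutputStream;

public class ServiceProviderJarSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical SPI and implementation names (placeholders,
        // not the actual Pentaho packages)
        String spi = "example.shim.HadoopShim";
        String impl = "example.shim.MockHadoopShim";

        // Build the jar in memory: one META-INF/services entry whose
        // content is the implementation class name
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (JarOutputStream jar = new JarOutputStream(bytes)) {
            jar.putNextEntry(new JarEntry("META-INF/services/" + spi));
            jar.write((impl + "\n").getBytes(StandardCharsets.UTF_8));
            jar.closeEntry();
        }

        // Read the archive back to confirm the service entry landed
        // where ServiceLoader expects it
        try (JarInputStream in = new JarInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            System.out.println(in.getNextJarEntry().getName());
        }
    }
}
```

Exporting through an in-memory stream like this is exactly the pattern the gist relies on: the archive never has to touch the real filesystem before landing in the VFS RAM provider.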
@ALRubinger

And that's why we have exportTo(OutputStream);

:)
