- AWS/EC2
- Implemented methods to compress, upload, decompress, and import large payloads
- The SDL command line cannot upload large payloads on its own
After analysis and experimentation, selected the following AWS/EC2 instance
- AWS/EC2 family: memory optimized
- AWS/EC2 instance: r4.large
- vCPUs: 2
- Memory: 15.25 GiB
- Uploaded and processed two large payloads
- NALT: 6.84 MB
- NCIT: 13.7 MB
- No other platform will process either file
- Google SDTT: both files exceed its 2.5 MB limit
- The SDL instance hosted by Gregg Kellogg bails out
- No other options exist
- SDL/AWS/EC2-NALT processed 500,000+ triples in 5 hours
- SDL/AWS/EC2-NCIT processed 900,000+ triples in 9 hours
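The two runs above imply a consistent throughput; a quick check of the arithmetic (using the lower-bound triple counts from the figures above):

```python
def triples_per_second(triples: int, hours: float) -> float:
    """Convert a (triple count, wall-clock hours) pair into triples/second."""
    return triples / (hours * 3600)

# NALT: 500,000+ triples in 5 hours
nalt_rate = triples_per_second(500_000, 5)   # roughly 28 triples/s
# NCIT: 900,000+ triples in 9 hours
ncit_rate = triples_per_second(900_000, 9)   # roughly 28 triples/s
```

Both payloads work out to about 28 triples/second, suggesting processing time scales roughly linearly with payload size on the r4.large instance.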
- Must cover the cost of running a dedicated AWS/EC2 server
- Consider a Patreon business model in which authorized customers access the dedicated AWS/EC2 server on demand
- Will need to implement an "authorization + authentication + access" process based on Patreon credentials