For medium-sized (not truly big) data that strains RAM, my go-to native R solution is {fst} + {data.table}. The key ideas are:
- load only the columns and rows you actually need into RAM
- during data wrangling, avoid shallow/deep copies of objects wherever possible; use reference semantics, e.g. {data.table}'s modify-in-place operations
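A minimal sketch of both ideas, assuming a table stored on disk in fst format (the file name `big_data.fst` and the column names are hypothetical):

```r
library(fst)
library(data.table)

# Write once: store the full table on disk in fst format
dt <- data.table(id = 1:1e6, x = rnorm(1e6), y = runif(1e6))
write_fst(dt, "big_data.fst")

# Load only what is needed: a subset of columns and a row range,
# directly as a data.table
small <- read_fst("big_data.fst",
                  columns = c("id", "x"),
                  from = 1, to = 1e5,
                  as.data.table = TRUE)

# Wrangle by reference: := and set*() modify 'small' in place,
# without copying the whole object
small[, x2 := x * 2]
setnames(small, "x2", "x_doubled")
```

Because fst is a column-oriented random-access format, `read_fst()` reads only the requested columns and row range from disk, so the rest of the file never touches RAM.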
Quote from *Lightning fast serialization of data frames using the fst package*:
> For a few years now, solid state disks (SSD's) have been getting larger in capacity, faster and much cheaper. It's not uncommon to find a high-performance SSD with speeds of up to multiple GB/s in a medium-end laptop. At the same time, the number of cores per CPU keeps growing. The combination of these two trends are opening up the way for data science to work on large data sets using a very modest computer setup.