@royra
Created November 21, 2019 11:10
Dump and read lots of JSONs in node
const fs = require('fs')
// Sample record; duplicated below to build a reasonably large JSON payload
const o = {
  ACCOUNT_NUMBER: '1234567890',
  CUSTOMER_NAME: 'ACME Products and Services, Inc.',
  ADDRESS: '123 Main Street',
  CITY: 'Albuquerque',
  STATE: 'NM',
  ZIP: '87101-1234'
}
// 200 copies of the record make up the contents of each file
const fileContents = []
for (let i = 0; i < 200; ++i) fileContents.push(o)

// 3000 target file paths
const files = []
for (let i = 0; i < 3000; ++i) {
  files.push(`/tmp/a_${i}`)
}
// Run an async function and log how long it took
const timeIt = async f => {
  const startTime = new Date()
  await f()
  console.log(`elapsed time: ${(new Date() - startTime) / 1000}s`)
}
const main = async () => {
  console.log(`dumping and writing ${files.length} JSONs of size ${Math.round(JSON.stringify(fileContents).length / 1024)}KB each`)

  // Serialize and write all files concurrently
  await timeIt(() => Promise.all(files.map(file =>
    fs.promises.writeFile(file, JSON.stringify(fileContents))
  )))

  console.log(`reading all the files`)

  // Read and parse all files concurrently
  await timeIt(() => Promise.all(files.map(async file => {
    JSON.parse(await fs.promises.readFile(file))
  })))
}
main().catch(err => {
  console.error(err)
  process.exit(2)
})
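To try it, save the script to a file (the name dump-read-jsons.js below is just an example) and run it with Node.js; fs.promises requires Node 10 or later, and the script writes its test files to /tmp:

node dump-read-jsons.js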