DEV REST API - On the Fly!



I have been on more than my fair share of projects where consumable services or databases are unreliable. This often happens when these are being developed in tandem or there are networking or infrastructure issues. Besides simply killing developer workflow, this can also have an impact on your ability to hit deadlines. Hoping a long lunch will fix things inevitably fails and you can only twiddle your thumbs or work on alternative tasks for so long.

json-server to the rescue!

Fortunately, there is an excellent tool called json-server. If you haven't used it before, it allows you to set up a full fake REST API with very little effort. You simply do an npm install and create a db.json file with your API data. It supports all HTTP verbs (GET, PUT, POST, PATCH, and DELETE), has pagination capabilities, provides simple querying via query strings, and even allows you to set up relationships, among other things.
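For example, a minimal db.json might look like this (the entity names are just illustrative):

```json
{
  "posts": [
    { "id": 1, "title": "Hello world", "userId": 1 },
    { "id": 2, "title": "Second post", "userId": 2 }
  ],
  "users": [
    { "id": 1, "name": "Alice" },
    { "id": 2, "name": "Bob" }
  ]
}
```

With that file in place, json-server --watch db.json serves routes like GET /posts, filtered queries like GET /posts?userId=1, and paginated queries like GET /posts?_page=1&_limit=10.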

Another great feature is the ability to dynamically create data using a tool like faker.js. While this is huge, it does take additional time to set up your entities. What I find works easier is to simply copy and paste my dummy data and run with that. Unfortunately, json-server only supports a single db.json file or programmatic creation of your data. Crafting nice entities through faker can be time consuming, and trying to manage a massive db.json file can be unwieldy.

How I get around this is by creating a simple build script that reads a collection of JSON files and concatenates them into the main db.json. As I add new entities or dummy data, I simply add a new JSON file.


const fs = require('fs')
const argv = require('yargs').argv

// Merge every JSON file in the API directory into one db.json object,
// keyed by file name (e.g. countries.json becomes the /countries route).
const buildDb = () => {
  const db = fs
    .readdirSync(argv.apiDir)
    .filter(file => file.endsWith('.json'))
    .map(file => {
      const fileContent = fs.readFileSync(argv.apiDir + '/' + file, 'utf8')
      const fileName = file.split('.')[0]
      return '"' + fileName + '": ' + fileContent
    })
    .join(',')
  fs.writeFileSync(argv.dbFile, '{' + db + '}')
}

buildDb()

// Rebuild whenever a file in the API directory changes.
fs.watch(argv.apiDir, (e, file) => {
  if (file) buildDb()
})



Run it locally

npm run debug
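The debug script just chains the build step and json-server. The script names and flags below reflect my setup, not anything json-server mandates:

```json
{
  "scripts": {
    "build:db": "node build-db.js --apiDir ./api --dbFile db.json",
    "debug": "npm run build:db && json-server --watch db.json"
  }
}
```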

Note: Keep in mind that this is an in-memory instance of the data. While you can PUT, POST, and DELETE, your changes are lost whenever the server restarts or db.json is rebuilt, and you will start over again.

Deploying Remotely

Heroku's GitHub integration makes this trivial, as it can read directly from a GitHub branch. I have included a postinstall script in the package.json which will install json-server and yargs, the only two dependencies. Heroku will then read from start, which kicks off serve.js.
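The relevant package.json entries are roughly the following (the exact script bodies are my setup):

```json
{
  "scripts": {
    "postinstall": "npm install json-server yargs",
    "start": "node serve.js"
  }
}
```

serve.js itself then wires up json-server: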

var jsonServer = require('json-server')
var server = jsonServer.create()
var router = jsonServer.router('db.json')
var middlewares = jsonServer.defaults()
var port = process.env.PORT || 3000

server.use(middlewares)
server.use(router)

server.listen(port, function() {
  console.log('json-server is running at port ' + port + '!')
})

(Screenshots: the REST home page and the countries REST endpoint of the deployed API.)
