
Version 5.0

jimlambie released this 08 Jan 12:38 · 302 commits to develop since this release

Added

  • #170: feature: file upload support
  • #253: feature: post processors 📬
  • #297: feature: datasource parameter extension

Changed

  • #193: fix expired token datasource failure
  • #216: construct endpoints with chained datasources
  • #288: fix required data check fails when given no data
  • #310: simplify start page: removes Dust.js and its dependencies from the Web core. Instead there is a simple start page powered by the ES6 template engine (rather than the previous blog configuration)
  • #312: allow Mongo aggregation queries
  • fix a bug where datasource parameters were added to a DADI API endpoint even when they already existed in the endpoint, leading to incorrect results being loaded from the API

Datasource parameter extension

Web 5.0 modifies the way request params are obtained, making them available from the request, the config or the session (if one exists). Perhaps requestParams isn't the right name for them any more; think of them as parameters for the next request rather than from the request.

Some examples

  • extract parameters from request (default)
  • extract parameters from config
  • extract parameters from session
  • extract parameters from chained datasource
  • modify endpoint from an event

Extracting parameters from different sources

 "requestParams": [
  {
    "source": "config",
    "param": "global.values.token",
    "field": "token",
    "target": "endpoint"
  },
  {
    "source": "session",
    "param": "user.profile._id",
    "field": "user",
    "target": "endpoint"
  },
  {
    "param": "title",
    "field": "title",
    "target": "filter"
  }
]
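Conceptually, params with "target": "endpoint" are substituted into the endpoint's placeholders before the datasource request is made. The helper below is a simplified sketch of that substitution, not Web's actual internals:

```javascript
// Replace each {field} placeholder in the endpoint with the
// corresponding extracted param value. Illustrative only.
function applyEndpointParams (endpoint, params) {
  return Object.keys(params).reduce(
    (result, field) => result.replace('{' + field + '}', params[field]),
    endpoint
  )
}

// e.g. a token from config and a user id from the session:
applyEndpointParams('1.0/messages/{user}?token={token}', {
  user: 'user-1',
  token: 'abc123'
})
```

Params with "target": "filter" are instead merged into the datasource's filter query.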

Inject params from previous datasources

In addition, chained datasources can now use data from the datasource they're attached to, modifying the endpoint when "target": "endpoint" is set:

{
  "datasource": {
    "key": "chained-endpoint",
    "source": {
      "endpoint": "1.0/makes/{name}"
    },
    "chained": {
      "datasource": "global",   // the ds we're waiting on
      "outputParam": {
        "param": "results.0.name",  // where to find the value in the "global" datasource
        "field": "name",   // the value to replace in this endpoint URL
        "target": "endpoint"  // obvious, really
      }
    }
  }
}

Modify datasource endpoint using Events

To modify a datasource's endpoint using an event, add an "endpointEvent": "filename-of-event.js" property to the datasource and return the new endpoint in the callback:

const Event = function (req, res, data, callback) {
  const endpoint = 'v1/twitter.json'

  return callback(null, endpoint)
}

module.exports = function (req, res, data, callback) {
  return new Event(req, res, data, callback)
}
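For context, the endpointEvent property sits alongside the datasource's source definition. The datasource key, placeholder endpoint and event filename below are illustrative:

```json
{
  "datasource": {
    "key": "twitter",
    "source": {
      "endpoint": "v1/placeholder.json"
    },
    "endpointEvent": "get-twitter-endpoint.js"
  }
}
```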

File uploads

Files uploaded from form fields will be saved to the destinationPath, with details of each file added to the request's files property. In an event, outputting req.files would give something like:

[
  {
    "fieldname": "flower",
    "originalname": "rose.png",
    "encoding": "7bit",
    "mimetype": "image/png",
    "destination": "workspace/uploads",
    "filename": "47b275c7aa7beeac411fd91ca4512f97a2bf08ea.png",
    "path": "workspace/uploads/47b275c7aa7beeac411fd91ca4512f97a2bf08ea.png",
    "size": 331660
  }
]

Configuration

  "uploads": {
    "enabled": true,
    "destinationPath": "workspace/uploads",
    "prefixWithFieldName": true,  // prefix the saved file with the name of the form field
    "prefix": "image-",  // prefix the saved file with the value specified (overrides prefixWithFieldName)
    "hashFilename": true,  // SHA1 the original filename
    "hashKey": "abcedfg",
    "whitelistRoutes": [  // an array of routes for which file uploads will be accepted
      "/:lang/profile/edit"
    ]
  }

📬 Post Processors

Adds the ability to manipulate the raw output of the template engine before the page is cached and served. Similar to the WordPress feature, it is useful for linting (think RSS feeds in particular), minifying and formatting. Processors are loaded in order of definition, global first, and the output is passed from one to the next so you can chain functions.
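Conceptually, the chain behaves like a reduce over the loaded processor functions; a simplified sketch (the function name is illustrative, not Web's internals):

```javascript
// Each processor receives the page's data object and the output of
// the previous processor; the final return value is what gets cached.
function runPostProcessors (processors, data, output) {
  return processors.reduce(
    (currentOutput, processor) => processor(data, currentOutput),
    output
  )
}

// e.g. with two trivial processors applied in order:
const trim = (data, output) => output.trim()
const upper = (data, output) => output.toUpperCase()
runPostProcessors([trim, upper], {}, '  hello  ')
```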

Example usage: minifying HTML

Add globalPostProcessors to the configuration file

"globalPostProcessors": [
    "minify-html"
]

The default location (paths.processors) is workspace/processors

Add processor file

This processor uses the html-minifier package

/workspace/processors/minify-html.js

const minify = require('html-minifier').minify

module.exports = (data, output) => {
  return minify(output, {
    collapseWhitespace: true,
    minifyCSS: true,
    minifyJS: true,
    removeRedundantAttributes: true,
    useShortDoctype: true,
    removeAttributeQuotes: true
  })
}

Note:

  • the page's JSON data object is also passed as the first argument
  • processors can also be defined or disabled per page in page.settings

Add a processor in addition to the global ones

"settings": {
  "postProcessors": ["remove-swear-words"]
}

Disable for given page

"settings": {
  "postProcessors": false
}

Upgrading from page.settings.beautify

Install package

npm install js-beautify --save

New file workspace/processors/beautify-html.js

const beautifyHtml = require('js-beautify').html

module.exports = (data, output) => {
  return beautifyHtml(output)
}

Old page.json

{
  "page": {
    "name": "Sitemap page",
    "description": "Sitemap",
    "language": "en"
  },
  "settings": {
    "cache": true,
    "beautify": true
  },
  ...
}

New page.json

{
  "page": {
    "name": "Sitemap page",
    "description": "Sitemap",
    "language": "en"
  },
  "settings": {
    "cache": true,
    "postProcessors": [
      "beautify-html"
     ]
  },
  ...
}