Releases: dadi/web
Version 7.0.3
Fixed
- #499: add CORS headers to response from cache
Version 7.0.1
Fixed
- #493: Allow duplicate nodes in public path
Version 7.0.0
Domain redirects
Web 7.0 modifies the way domain redirects are configured. Previously, only a hostname to redirect to was supplied, as a String property. This is a breaking change in Web 7.0: forceDomain must now be configured as an object containing the hostname, port and redirect type (the type is now a number rather than a string).
The default is an empty hostname, which effectively disables the functionality.
"default": {
"hostname": "",
"port": 80,
"type": 301
}
Example
To redirect from www.example.com to example.com temporarily, you could configure it thus:
"rewrites": {
"forceDomain": {
"hostname": "example.com",
"type": 307
}
}
Simpler datasource specifications
Web 7.0 allows flattened datasource specification files, to match the page specification files. Notice in the example below that there is no top-level datasource property. No changes to existing projects are required; this change is backwards compatible with previous versions.
{
"key": "cars",
"name": "Cars datasource",
"source": {},
"count": 20,
"search": {},
"filter": {}
}
Simpler page specifications
Web 7.0 allows page specification files without the "page" block, using the filename as the page name internally.
Both of the following examples will work for index.json:
{
"page": {
"name": "index"
},
"routes": [
{
"path": "/"
}
]
}
{
"routes": [
{
"path": "/"
}
]
}
Add "composed" option to datasources
The value of the composed
property will be added to the API endpoint querystring.
{
"datasource": {
"key": "cars",
"name": "Cars datasource",
"source": {
"endpoint": "/1.0/db/cars"
},
"compose": "all",
"count": 20
}
}
The generated endpoint will become /1.0/db/cars?count=20&compose=all
HTTP2
Start the server with support for HTTP2 by changing the protocol used in the server configuration:
"server": {
"host": "example.com",
"port": 443,
"protocol": "http2",
"sslPrivateKeyPath": "keys/server.key",
"sslCertificatePath": "keys/server.crt",
}
This basic level of support gives Web most of the HTTP2 benefits such as multiplexing, binary transfers and header compression, and is backwards compatible with clients that do not support it 🎉
Fixed
- #263: allow composed option in datasource specification files
- #438: simplify page specification files
- #469: allow adding/modifying/deleting page.json files while the application is running
- #479: allow flattened datasource specification files
- #489: fix an issue whereby the querystring parameters of a URL were not taken into account when constructing a cache key
Version 6.1.1
Changed
- Add a property to the markdown datasource to specify whether the HTML should be rendered and added to the output. To disable HTML, simply extend the source block with a renderHtml property:
"source": { "type": "markdown", "path": "./docs", "renderHtml": false }
Version 6.1.0
Added
- #405: Multi-language support, see documentation https://docs.dadi.cloud/web/6.1#internationalization
Version 6.0.1
Changed
- #399: improve the process of selecting a loaded endpoint based on the request URL within the cache layer
Version 6.0.0
New features
Debug view
A new debug view has been added to allow developers to understand how a particular page has been generated. In previous versions of Web we've used a configuration setting, allowJsonView, which enabled the option of seeing the data context used to build a rendered page - this could be used on any page by appending json=true to the URL.
The new debug view in this version is far more powerful and provides greater insight into how your data is being used within Web.
To enable the debug view, we've provided a new configuration setting, allowDebugView. This replaces the existing allowJsonView setting, however it is backwards-compatible with the ?json=true parameter: this still works but is translated to ?debug=json.
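A minimal sketch of enabling the new setting in the main configuration file (placing the property at the top level is an assumption, based on the allowJsonView setting it replaces):
{
  "allowDebugView": true
}
With this enabled, append ?debug=json to any page URL to see the data context used to build it.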
Documentation is available on the DADI Documentation website, but here are a few of the things the debug view can show:
- the DADI Web version and Node.js version
- the data Web used to construct the page, such as the template name and attached datasources and events
- the rendered template next to the output with postProcessors applied
- the list of routes and the current page pathname
- the request and response headers
- the raw page data JSON
- time to render output
- size of raw data payload
- which page route matched the current path
- size of rendered output (before compression)
Multiple API support
Version 6.0 removes Web's 1:1 relationship with DADI API. Previously Web allowed for a single API connection via the main configuration file:
"api": {
"host": "127.0.0.1",
"port": 3000
}
"auth": {
"tokenUrl": "/token",
"clientId": "your-client-id",
"secret": "your-secret"
}
With this approach, if you wanted to connect a datasource to a different DADI API, those details had to be supplied in the datasource specification, making the maintenance of configuration somewhat difficult, with keys potentially spread throughout the app.
Version 6.0 allows adding multiple DADI API configurations which can be referenced from a datasource by API name:
Main configuration
"api": {
"main": {
"host": "api-one.somedomain.com",
"port": 80,
"auth": {
"tokenUrl": "/token",
"clientId": "your-client-id",
"secret": "your-secret"
}
},
"secondary": {
"host": "api-two.somedomain.com",
"port": 80,
"auth": {
"tokenUrl": "/token",
"clientId": "your-client-id",
"secret": "your-secret"
}
}
}
Datasource configuration
{
"datasource": {
"key": "articles",
"source": {
"api": "main",
"endpoint": "1.0/library/articles"
},
"count": 12,
"paginate": false
}
}
REST API provider
Version 6.0 removes the wordpress and twitter data providers, replacing them with a restapi provider. Details are available here: https://docs.dadi.tech/web/#rest-api. The main difference between the existing remote provider and the new restapi provider is that the restapi provider can be supplied with authentication configuration.
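As an illustrative sketch only (the property names below are assumptions rather than taken from these notes; see the linked documentation for the actual schema), a datasource source block using the new provider might look like this:
"source": {
  "type": "restapi",
  "endpoint": "https://api.example.com/v1/articles",
  "auth": {
    "clientId": "your-client-id",
    "secret": "your-secret"
  }
}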
Changed
- #258: give globalEvents access to full page data, and run per request instead of only at startup
- #262: default workspace folders no longer created unnecessarily, even if the config paths specified different locations than the default
- #267: fix Markdown provider crash if data was malformed
- #336: middleware can now intercept public folder requests
- #350: add support for range header requests on static assets, so a browser can specify which part of a file it wants to receive and when
- #370: add configuration options for @dadi/status
- Deprecated configuration setting for compression removed: headers.useGzipCompression was deprecated in Web 4.0 in favour of the more generic useCompression (see the sketch below).
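A minimal sketch of the replacement setting in the main configuration file (assuming it sits under the same headers block as the setting it replaces):
"headers": {
  "useCompression": true
}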
Page whitespace configuration
Note: this is a breaking change
This configuration option has been moved within the page specification file to a block dedicated to the chosen template engine. Modify existing page.json files as follows:
Existing
"settings": {
"keepWhitespace": true
}
New
"settings": {
"engine": {
"keepWhitespace": true
}
}
Template engine developers can also use engine to pass any page-specific setting to their engine.
Dependency updates
- upgrade to Brotli 1.0 compression engine
- upgrade router to use version 2.0 of path-to-regexp
Version 4.0.3
This release contains a back port of the changes made in Version 5.0 that allowed multiple replacements to be made in a datasource's endpoint, based on request parameters. The associated pull request and issue can be seen here.
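As a sketch of what this enables (the key, endpoint and parameter names are illustrative, and the requestParams shape shown follows the Version 5.0 examples below, so it may differ slightly in 4.0.3), a datasource endpoint containing more than one placeholder can now be filled from the request parameters:
{
  "datasource": {
    "key": "models",
    "source": {
      "endpoint": "1.0/makes/{make}/models/{model}"
    },
    "requestParams": [
      { "param": "make", "field": "make", "target": "endpoint" },
      { "param": "model", "field": "model", "target": "endpoint" }
    ]
  }
}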
Version 5.0
Added
- #170: feature: file upload support
- #253: feature: post processors 📬
- #297: feature: datasource parameter extension
Changed
- #193: fix expired token datasource failure
- #216: construct endpoints with chained datasources
- #288: fix required data check failing when given no data
- #310: simplify start page: removes Dust.js and its dependencies from the Web core. Instead there is a simple start page powered by the ES6 template engine (rather than the previous blog configuration)
- #312: allow Mongo aggregation queries
- fix bug where datasource parameters would be added to a DADI API endpoint when they already exist in the endpoint - leading to incorrect results loaded from the API
Datasource parameter extension
Web 5.0 modifies the way request params are obtained, making them available from the request, config or the session (if one exists). Perhaps requestParams isn't the right name for them now, but we could shift the perspective that they are for the next request rather than from the request.
Some examples
- extract parameters from request (default)
- extract parameters from config
- extract parameters from session
- extract parameters from chained datasource
- modify endpoint from an event
Extracting parameters from different sources
"requestParams": [
{
"source": "config",
"param": "global.values.token",
"field": "token",
"target": "endpoint"
},
{
"source": "session",
"param": "user.profile._id",
"field": "user",
"target": "endpoint"
},
{
"param": "title",
"field": "title",
"target": "filter"
}
]
Inject params from previous datasources
In addition, chained datasources can now use data from the datasource they're attached to, modifying the endpoint if "target": "endpoint" is specified:
{
"datasource": {
"key": "chained-endpoint",
"source": {
"endpoint": "1.0/makes/{name}"
},
"chained": {
"datasource": "global", // the ds we're waiting on
"outputParam": {
"param": "results.0.name", // where to find the value in the "global" datasource
"field": "name", // the value to replace in this endpoint URL
"target": "endpoint" // obvious, really
}
}
}
}
Modify datasource endpoint using Events
To modify a datasource's endpoint using an event, add a property to the datasource, "endpointEvent": "filename-of-event.js", and return the new endpoint in the callback:
const Event = function (req, res, data, callback) {
  const endpoint = 'v1/twitter.json'
  return callback(null, endpoint)
}

module.exports = function (req, res, data, callback) {
  return new Event(req, res, data, callback)
}
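For reference, a hypothetical datasource specification wiring up such an event (the key and endpoint are placeholders; only the endpointEvent property itself comes from these notes):
{
  "datasource": {
    "key": "tweets",
    "source": {
      "endpoint": "v1/placeholder.json"
    },
    "endpointEvent": "filename-of-event.js"
  }
}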
File uploads
Files uploaded from form fields will be saved to the destinationPath, with details of each file added to the request's "files" property. Using this in an event, outputting req.files would give something like:
[
{
"fieldname": "flower",
"originalname": "rose.png",
"encoding": "7bit",
"mimetype": "image/png",
"destination": "workspace/uploads",
"filename": "47b275c7aa7beeac411fd91ca4512f97a2bf08ea.png",
"path": "workspace/uploads/47b275c7aa7beeac411fd91ca4512f97a2bf08ea.png",
"size": 331660
}
]
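A minimal sketch of an event that reads the uploaded file details (the filename and the way the result is used are illustrative assumptions; the event signature follows the endpoint event example above):
// workspace/events/handle-upload.js (hypothetical)
const Event = function (req, res, data, callback) {
  // details of any uploaded files are available on req.files
  const uploads = req.files || []
  // illustrative: return the saved file paths as this event's data
  return callback(null, uploads.map(file => file.path))
}

module.exports = function (req, res, data, callback) {
  return new Event(req, res, data, callback)
}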
Configuration
"uploads": {
"enabled": true,
"destinationPath": "workspace/uploads",
"prefixWithFieldName": true, // prefix the saved file with the name of the form field
"prefix": "image-", // prefix the saved file with the vale specified (overrides prefixWithFieldName)
"hashFilename": true, // SHA1 the original filename
"hashKey": "abcedfg",
"whitelistRoutes": [ // an array of routes for which file uploads will be accepted
"/:lang/profile/edit"
]
}
📬 Post Processors
Adds the ability to manipulate the raw output of the template engine before the page is cached and served. Similar to the WordPress feature, it is useful for linting (think RSS feeds in particular), minifying and formatting. Post processors are loaded in order of definition, global first, and the output is passed from one to the next so you can chain functions.
Example usage: minifying HTML
Add globalPostProcessors to configuration file
"globalPostProcessors": [
"minify-html"
]
The default location in paths.processors is workspace/processors.
Add processor file
This processor uses the html-minifier package.
/workspace/processors/minify-html.js
const minify = require('html-minifier').minify

module.exports = (data, output) => {
  return minify(output, {
    collapseWhitespace: true,
    minifyCSS: true,
    minifyJS: true,
    removeRedundantAttributes: true,
    useShortDoctype: true,
    removeAttributeQuotes: true
  })
}
Note:
- the page JSON data object is also passed to each processor
- post processors can also be defined or disabled at page level in page.settings
Add a processor in addition to the global list
"settings": {
"postProcessors": ["remove-swear-words"]
}
Disable for a given page
"settings": {
"postProcessors": false
}
Upgrading from page.settings.beautify
Install package
npm install js-beautify --save
New file workspace/processors/beautify-html.js
const beautifyHtml = require('js-beautify').html

module.exports = (data, output) => {
  return beautifyHtml(output)
}
Old page.json
{
"page": {
"name": "Sitemap page",
"description": "Sitemap",
"language": "en"
},
"settings": {
"cache": true,
"beautify": true
},
...
}
New page.json
{
"page": {
"name": "Sitemap page",
"description": "Sitemap",
"language": "en"
},
"settings": {
"cache": true,
"postProcessors": [
"beautify-html"
]
},
...
}
Version 3.1.3
Changed
- #288: fix an issue where an event returning a 404 error would cause a failure if the current page had requiredDatasources set, as no data object is passed