Thanks Peter. My recommendation was to pass that variable inline to the call, which wouldn't set an ENV variable as far as I am aware.

I do like your approach better, however, as the variable is set for the entire build.
You will need to pass DEBIAN_FRONTEND=noninteractive to apt-get:

RUN apt-get update \
    && apt-get upgrade -y \
    && DEBIAN_FRONTEND=noninteractive apt-get install -y \
    apache2 \
    libapache2-mod-php \
    php7.0 \
    php7.0-mysql \
    && rm -rf /var/lib/apt/lists/*
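For completeness, the ENV form you described would look something like this (an untested sketch, same packages as above):

```dockerfile
# ENV applies to every subsequent instruction in the build
ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get update \
    && apt-get upgrade -y \
    && apt-get install -y \
    apache2 \
    libapache2-mod-php \
    php7.0 \
    php7.0-mysql \
    && rm -rf /var/lib/apt/lists/*
```

One caveat: an ENV set this way persists into the final image (it affects containers at runtime, not just the build), which is the usual argument for the inline form.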

I would like to better understand how propagating an OAuth2 access token would work.

In my experience, propagating the user's token to downstream services doesn't work for one major reason: the service invoking the downstream service often needs privileged access to data that the end user would never have permission to view him/herself.

Imagine a credit card service that needs to pull the user's credit score as well as invoke potentially sensitive and proprietary rules to determine whether a user is eligible for a company's "platinum card".

For this reason, services themselves need to have permissions granted to them that are different from the user; these services use their own credentials when invoking downstream services.


user---(user token)--->service A---(service token)--->service B

Service A would assume the responsibility for returning only the information the user should see, so the filtering of sensitive data happens there after it has been used for processing. This makes sense as Service A is the service responsible for returning information to the user.
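To make that filtering step concrete, here is a minimal sketch of what service A might do before returning data to the user (all field names here are hypothetical, purely for illustration):

```javascript
// Service A fetched this record downstream using its own service credentials.
// Before returning it to the end user, it strips fields the user is not
// entitled to see (e.g. the raw credit score and the proprietary rule trace).
var SENSITIVE_FIELDS = ['creditScore', 'eligibilityRuleTrace'];

function filterForUser(record) {
    var visible = {};
    Object.keys(record).forEach(function (key) {
        if (SENSITIVE_FIELDS.indexOf(key) === -1) {
            visible[key] = record[key];
        }
    });
    return visible;
}

var downstreamResponse = {
    customerId: '42',
    platinumEligible: true,
    creditScore: 712,              // used for the decision, never shown
    eligibilityRuleTrace: ['r1']   // proprietary rules, never shown
};

console.log(filterForUser(downstreamResponse));
// → { customerId: '42', platinumEligible: true }
```

The point is that the sensitive fields are consumed inside service A's trust boundary and never cross back to the caller.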

My question is: am I reading this wrong? And if not, how does the author account for the complexities I have described?

Thanks Peter. This makes sense and I look forward to the additional content!
I love Manning, but the first edition of this book is only a little over a year old. Many people, including myself, needed to modify the code that was distributed with the book because it doesn't work, and there seems to be no help offered through the forums.

I just bought the book two weeks ago.

As the content of this edition seems identical to the first, I wonder what the motivation is to create a whole new edition rather than simply update the original book's code samples.
This might also help, as modifications needed to be made to other files as well. Here is the output from my UserProfileHandler.js file. I am not a Node.js developer, so please forgive the errors that I am sure exist in the file. I uploaded my public key cert from Auth0 to an S3 bucket and used that to verify incoming tokens:

'use strict';

var AWS = require('aws-sdk');
var jwt = require('jsonwebtoken');
var request = require('request');
var s3 = new AWS.S3();

var verifyToken = function (token, secretOrPublicKey, verifyOptions, callback) {

    jwt.verify(token, secretOrPublicKey, verifyOptions, function (err, decoded) {

        if (err) {
            console.log('Failed jwt validation: ', err, 'auth: ', token);
            return callback('Authorization Failed');
        }

        var headers = {
            'Authorization': "Bearer " + token
        };

        var options = {
            url: 'https://' + process.env.DOMAIN + '/userinfo',
            method: 'GET',
            json: true,
            headers: headers
        };

        request(options, function (error, response, body) {
            if (!error && response.statusCode === 200) {
                callback(null, body);
            } else {
                console.log("tokeninfo error: ", error);
                console.log("response.statusCode: ", response.statusCode);
                callback('Authorization Failed');
            }
        });
    });
};

exports.handler = function (event, context, callback) {

    if (!event.authToken) {
        return callback('Could not find authToken');
    }

    var token = event.authToken.split(' ')[1]; // the header contains the word Bearer before the token

    if (!process.env.CLIENT_SECRET) {

        console.log("grabbing public key from s3");

        var params = {
            Bucket: process.env.PUBLIC_KEY_BUCKET_NAME,
            Key: process.env.PUBLIC_KEY_BUCKET_KEY
        };

        s3.getObject(params, function (s3err, data) {

            if (s3err) {
                console.log(s3err, s3err.stack);
                callback('Authorization Failed');
            } else {
                // note: jsonwebtoken expects "algorithms" (an array), not "algorithm"
                verifyToken(token, new Buffer(data.Body, 'binary'), { algorithms: ['RS256'] }, callback);
            }
        });

    } else {
        console.log("using client secret");
        verifyToken(token, process.env.CLIENT_SECRET, { algorithms: ['HS256'] }, callback);
    }
};
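One detail worth calling out from the handler above: `event.authToken` arrives as `Bearer <token>`, so the prefix has to be stripped before `jwt.verify` sees it. A tiny standalone illustration (not from the book):

```javascript
// Extract the raw JWT from an "Authorization: Bearer <token>" value.
// Returns null for anything that is not exactly "Bearer <token>".
function extractToken(authHeader) {
    var parts = (authHeader || '').split(' ');
    if (parts.length !== 2 || parts[0] !== 'Bearer') {
        return null;
    }
    return parts[1];
}

console.log(extractToken('Bearer abc.def.ghi')); // "abc.def.ghi"
console.log(extractToken('abc.def.ghi'));        // null (no Bearer prefix)
```

Validating the shape up front gives a cleaner error than letting `jwt.verify` fail on a malformed string.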

This is a show stopper for me. Can the author please supply updated source code?

I appreciate the previous user posting a partial solution, but I cannot access his file.

Edit: I think I solved most of my issues. Here is the output of user-controller.js:

var userController = {
    data: {
        auth0Lock: null,
        config: null
    },
    uiElements: {
        loginButton: null,
        logoutButton: null,
        profileButton: null,
        profileNameLabel: null,
        profileImage: null
    },
    init: function (config) {

        var that = this;

        this.uiElements.loginButton = $('#auth0-login');
        this.uiElements.logoutButton = $('#auth0-logout');
        this.uiElements.profileButton = $('#user-profile');
        this.uiElements.profileNameLabel = $('#profilename');
        this.uiElements.profileImage = $('#profilepicture'); = config;

        this.data.auth0Lock = new Auth0Lock(config.auth0.clientId, config.auth0.domain, {
            auth: {
                responseType: 'id_token token',
                params: { // params set in config.js
                    scope: config.auth0.scope,
                    audience: config.auth0.audience,
                    redirectUrl: "",
                    responseType: "token"
                }
            }
        });

        this.data.auth0Lock.on('authenticated', function (authResult) {
            console.log("authenticated: ", authResult);
            localStorage.setItem('userToken', authResult.accessToken);
            that.retrieveProfileData(authResult.accessToken);
        });

        var idToken = localStorage.getItem('userToken');

        if (idToken) {
            this.retrieveProfileData(idToken);
        }

        this.wireEvents();
    },
    retrieveProfileData: function (accessToken) {
        console.log("retrieveProfileData: ", accessToken);
        var that = this;

        console.log("retrieving profile");
        this.configureAuthenticatedRequests();

        this.data.auth0Lock.getUserInfo(accessToken, function (err, profile) {
            if (err) {
                return alert('There was an error getting the profile: ' + err.message);
            }
            that.showUserAuthenticationDetails(profile);
        });
    },
    configureAuthenticatedRequests: function () {
            'beforeSend': function (xhr) {
                var token = localStorage.getItem('userToken');
                console.log("sending request with token: " + token);
                xhr.setRequestHeader('Authorization', 'Bearer ' + token);
            }
        });
    },
    showUserAuthenticationDetails: function (profile) {

        console.log("showUserAuthenticationDetails: ", profile);
        var showAuthenticationElements = !!profile; // coerce into a boolean (!!1 evaluates to true, !!0 evaluates to false)

        if (showAuthenticationElements) {
            this.uiElements.profileNameLabel.text(profile.nickname);
            this.uiElements.profileImage.attr('src', profile.picture);
        }

        this.uiElements.loginButton.toggle(!showAuthenticationElements);
        this.uiElements.logoutButton.toggle(showAuthenticationElements);
        this.uiElements.profileButton.toggle(showAuthenticationElements);
    },
    wireEvents: function () {

        var that = this;

        this.uiElements.loginButton.click(function (e) {
  ;
        });

        this.uiElements.logoutButton.click(function (e) {
            localStorage.removeItem('userToken');
            that.showUserAuthenticationDetails(null);
        });

        this.uiElements.profileButton.click(function (e) {

            var url = + '/user-profile';

            $.get(url, function (data, status) {
                $('#user-profile-raw-json').text(JSON.stringify(data, null, 2));
            });
        });
    }
};

Thanks for the help! I was stuck on this for hours.
Sorry, the proper error was:

java.lang.IllegalStateException: You can't query operators <, <=, ==, !=, >=, > on multi-value properties
The SQL statement from the 5.3.2 NULL Predicate section:

SELECT * FROM cmisbook:note WHERE cmisbook:noteLinks <> 'resource.txt'

results in a runtime exception.

Is this because cmisbook:noteLinks is a multi-valued property?
Indeed... this was in the console window of the server:

java.lang.IllegalStateException: Operator IN only is allowed on single-value properties
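If I am reading the CMIS query grammar correctly, multi-valued properties can only be searched with the quantified predicates, so (assuming I have the syntax right) the working equivalents would be:

```sql
-- quantified comparison predicate: matches notes where any value equals 'resource.txt'
SELECT * FROM cmisbook:note WHERE 'resource.txt' = ANY cmisbook:noteLinks

-- quantified IN predicate: matches notes where any value is in the list
SELECT * FROM cmisbook:note WHERE ANY cmisbook:noteLinks IN ('resource.txt', 'other.txt')
```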
The following statement doesn't refer to a consistent attribute name and so is confusing to the reader:

"All subtypes of
cmis:document in a repository must have the versionable attribute that was
introduced at the cmis:document level. However the specific boolean value of
versionable for each of those sub types is set independently. So in a sample
repository cmis:document might be versionable=true, and still have a subtype
named invoiceDocument that had fileable=false."

Perhaps it should read instead:

"All subtypes of
cmis:document in a repository must have the versionable attribute that was
introduced at the cmis:document level. However the specific boolean value of
versionable for each of those sub types is set independently. So in a sample
repository cmis:document might be versionable=FALSE, and still have a subtype
named invoiceDocument that had versionable=TRUE."
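A quick way to see which value each type actually carries is to print the type definitions from the workbench's Groovy console. This is only a sketch: it assumes an open `session` binding and the book's sample `cmisbook:pdf` subtype.

```groovy
// Print the versionable attribute for cmis:document and one of its subtypes
['cmis:document', 'cmisbook:pdf'].each { typeId ->
    def type = session.getTypeDefinition(typeId)
    println "${typeId} versionable: ${type.versionable}"
}
```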
The FileInputStream constructor is passed a variable that does not exist in the script; it should be passed the variable 'f', not the variable 'file':

def f = new File('/users/jpotts/Desktop/potts_contract.docx')
def name = f.getName()
def mimetype = someDoc.contentStreamMimeType
def contentStream = new ContentStreamImpl(name,
new FileInputStream(file)) // "file" is not declared... this should be "f"
Listing 3.8 is missing the line:


Without it, the document property cmis:isVersionSeriesCheckedOut resolves to false despite the fact that the repository has actually checked the file out.
Examples which use ContentStreamImpl in the book do not include its import statement (e.g. section 3.9):

import org.apache.chemistry.opencmis.commons.impl.dataobjects.*
The same is true of the exception classes:

import org.apache.chemistry.opencmis.commons.exceptions.*
I am attempting to create a versioned document per the tutorial in 3.2.4, but I am running into a problem. It seems that the VersionableType id is invalid. Was this a typo, or is there an outstanding bug in Chemistry?

org.apache.chemistry.opencmis.commons.exceptions.CmisObjectNotFoundException: unknown type id: VersionableType
In case others are wondering how to navigate around this error, I simply substituted "cmisbook:pdf" for "VersionableType", per the following line of code:

def someDoc = cmis.createDocumentFromFile(someFolder, f, "cmisbook:pdf", VersioningState.MAJOR)
Fantastic book, BTW. It's a joy to read. It's an absolutely fantastic idea to bundle the server and workbench code together in the way that you did, and I hope other authors are inspired by your example. is magnificently simple.

I am not on my work machine, but some of the issues I encountered were related to the Ant targets not existing in 5.11 (specifically the standalone database). Running the BookOrderTest, for example, results in a JdbcSQLException being thrown unless you modify it to use InMemProcessEngineConfiguration.

I also had problems running the Activiti Modeler, which seems to reference a workspace directory I can only presume existed in previous incarnations of Activiti but is no longer present.

Certain classes in the example code seem not to compile against the 5.11 libraries. In the bpmn-examples code, neither org.bpmnwithactiviti.chapter10.ldap.LDAPGroupManager nor org.bpmnwithactiviti.chapter10.ldap.LDAPUserManager works under 5.11, though both compile fine under 5.9.

Using the latest Designer plugin, I have no option to "Create deployment artifacts" within Eclipse; at this point in the book I have no idea how to deploy my processes into a container. I see no Maven target that would allow me to create the artifacts either (which seems counterintuitive).

Lastly, attempting to convert a user task to a script task in the Designer tool results in a NullPointerException being thrown. Though this is surely a bug in Activiti itself, I do wonder how this kind of thing gets into a release; I would presume the Activiti team has unit tests to protect against this kind of regression. It makes me question how suitable Activiti would be for use in an enterprise (where such GUI tools would see frequent use, and even a small regression such as this could cost a lot of money and raise eyebrows).

Thanks for the prompt reply.

This comment was largely related to the changes made in Activiti 5.11 that broke many of the directions/examples in the book. I am attempting to downgrade Activiti to resolve my problems...

The repo seems to be dead, and I cannot get the application to compile as a result:

[ERROR] Failed to execute goal on project scatours-contribution-buildingblocks: Could not resolve dependencies for project Failed to collect dependencies for [ (compile), (runtime), (runtime), (runtime), (provided), (test), (test), junit:junit:jar:4.5 (test)]: Failed to read artifact descriptor for opensaml:opensaml:jar:1.1: Could not transfer artifact opensaml:opensaml:pom:1.1 from/to ( No connector available to access repository ( of type legacy using the available factories WagonRepositoryConnectorFactory -> [Help 1]

Any help would be appreciated.