Connecting Salesforce to a Heroku Database

A popular use of Salesforce is as a front-end system of engagement, using a lookup integration to the more static system of record data in a back end such as SAP. I wanted to set up a demo to show this, but I didn’t have access to an SAP environment, so I decided to simulate the back end by creating a Postgres database on Heroku.

I decided to use some publicly available open data on properties as the database and for this to be looked up dynamically from the account record in Salesforce.

Here are the steps I went through.

First, get a Heroku account: go to heroku.com and ‘sign up for free’.

image001

Then log in.
On my computer I installed Node.js from https://nodejs.org/

And npm from https://github.com/npm/npm

And the Heroku toolbelt from https://devcenter.heroku.com/articles/getting-started-with-nodejs#set-up

Once installed, you can use the heroku command from your command shell. Log in using the email address and password you used when creating your Heroku account:

image003

Execute the following commands to clone the sample application:

image004
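The commands in the screenshot follow Heroku’s Node.js getting-started guide; the repository name below is my assumption based on that guide, so follow the guide itself if the clone fails:

```shell
# Clone Heroku's sample Node.js application and move into it
git clone https://github.com/heroku/node-js-getting-started.git
cd node-js-getting-started
```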

You now have a functioning git repository that contains a simple application as well as a package.json file, which is used by Node’s dependency manager.

image005

Now we create an app on Heroku, which prepares Heroku to receive the source code.

image006
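That step is a single command, run from inside the cloned repository:

```shell
# Creates the app on Heroku and adds a git remote called 'heroku';
# Heroku prints the generated app name (in my case glacial-sierra-8855)
heroku create

# Confirm the new remote was added
git remote -v
```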

When you create an app, a git remote (called heroku) is also created and associated with your local git repository.

Heroku generates a random name (in this case glacial-sierra-8855) for your app.

The package.json file determines both the version of Node.js that will be used to run your application on Heroku, as well as the dependencies that should be installed with your application. When an app is deployed, Heroku reads this file and installs the appropriate node version together with the dependencies using the npm install command.

Run this command in your local directory to install the dependencies, preparing your system for running the app locally:

image007
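That is simply:

```shell
# Installs the dependencies listed in package.json into node_modules
npm install
```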

Now we add a free Heroku Postgres Starter Tier dev database to the app.

image008
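The add-on command looks roughly like this; note the free plan’s name has changed over time (heroku-postgresql:dev, later :hobby-dev), so use whichever your version of the CLI accepts:

```shell
# Attach a free dev Postgres database to the app
heroku addons:add heroku-postgresql:dev

# Confirm the database is attached and see its connection details
heroku pg:info
```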

Now we need to add Postgres to the path:

image009
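The exact path depends on how you installed Postgres; for example, on a Mac with Postgres.app it might be:

```shell
# Adjust this path for your own installation (e.g. /usr/lib/postgresql/<version>/bin on Linux)
export PATH=$PATH:/Applications/Postgres.app/Contents/Versions/latest/bin

# Verify psql is now found
psql --version
```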

I found some sample data of properties here.

Download the CSV file to your current directory and delete the first line so there’s only one header.

Now add an index column: in Excel, insert a new column, put 1 and 2 as the first two entries, then highlight these and drag the mouse down so that it auto-populates the other records with increasing indices:

image010

image012
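If you prefer the command line to Excel, a one-liner can add the index column instead (this assumes the file is named properties.csv with a single header row; adjust to your file):

```shell
# Prepend an "index" column: header gets the label, each data row gets its row number
awk -F, 'NR==1 {print "index," $0; next} {print NR-1 "," $0}' properties.csv > properties_indexed.csv
```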

Now we create the database table in the same format as the spreadsheet.

image013
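The DDL depends entirely on the columns in the CSV you downloaded; as a sketch, from the heroku pg:psql prompt it might look like this (the column names below are purely illustrative, so match them to your CSV’s actual header):

```sql
-- Illustrative only: use the columns and types from your own CSV
CREATE TABLE emptyproperties (
  index        integer,       -- the index column added in Excel
  address      varchar(200),
  postcode     varchar(10),
  empty_since  date
);
```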

Run a select statement to check it worked:

image014
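For example, still at the psql prompt (table name as created above):

```sql
-- Should return zero rows at this point, but proves the table exists
SELECT * FROM emptyproperties;
```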

Set the date format to be UK format like the spreadsheet.

image015

Then copy the spreadsheet into the database:

image016
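The two steps above might look like this in psql, assuming the CSV is named properties.csv and sits in the directory you started psql from:

```sql
-- Dates in the CSV are in UK day/month/year order
SET datestyle = 'ISO, DMY';

-- \copy reads the file client-side, so the path is local to your machine
\copy emptyproperties FROM 'properties.csv' WITH CSV HEADER
```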

So now we have an app, a database and a table. In order to access it from Salesforce we need to request that the database be enabled as an external object by raising a ticket at https://help.heroku.com/

You will need to provide Support with the application name (in this case glacial-sierra-8855).

Once external object support has been enabled, go to https://connect.heroku.com/ and set the app up as a Heroku Connect instance, noting the username, password and URL for it.

Select the ‘emptyproperties’ data source to share:

image017

Now, to connect to this from Salesforce you need a Salesforce org with Lightning Connect/OData enabled. If you request a developer environment from https://developer.salesforce.com/signup you will get this functionality automatically.

In your Salesforce org:

  • Click Setup (upper right corner)
  • Click Develop > External Data Sources (left navigation)
  • Click New External Data Source
  • Enter OrderDB as the Label. As you click or tab away from the label field, the Name field should automatically default to OrderDB.
  • Select Lightning Connect: OData 2.0 as the Type.

(OrderDB doesn’t have to be the name, choose something meaningful for you).

Enter the URL, username and password from Heroku:

image019

  • Click ‘Validate and Sync’
  • Select the ‘emptyproperties’ table
  • Click ‘Sync’

image021

Then click into the ‘emptyproperties’ external object.

image023

You should be able to see that all the fields have been picked up from the database:

image025

You can now create a custom tab to easily access the properties:

  • Click Setup (upper right corner)
  • Click Create > Tabs
  • Click the New button next to Custom Object Tabs.
  • Select properties as the Object.
  • Click the selector next to Tab Style and choose whichever style you like.
  • Click Next.
  • Click Next to accept the default tab visibility settings.
  • Choose the apps that you want the tab to be included in.
  • Click Save.

image027

Now there will be a new tab:

image029

Click ‘Go!’ to view all. All the records in the database are now accessible by clicking through the external IDs.

image031

Now go back to ‘external objects’ where you were just before creating the tab.

Now we want to make the index an external lookup. Click ‘edit’ next to index.

image033

Select “change field type”:

image035

  • Select External Lookup Relationship and click Next. An external lookup relationship can link any object to an external object.
  • Select emptyproperties as the value of Related To and click Next.

image037

  • Enter 4 as the value for Length and click Next.
  • Enable the Visible checkbox to make the relationship visible to all profiles, and click Next.
  • Click Save to accept the defaults – you definitely want the related list on the page layout!

Now go to the properties tab and select an external id and the full property detail is displayed:

image039

[Edit: if the property details don’t show up then you will need to go to the user’s Profile and enable Read in the FLS of your external data object fields.]

Now let’s assign properties to accounts. So we’ll edit the account record and add a property:

  • Go to Setup > Customize > Accounts > Fields
  • Click New custom field
  • Select External Lookup Relationship
  • Select properties as the related object, but change the field label to just Property
  • Step through the remaining screens and save
  • Now when we go to an account we see an empty Property field

image041

If we edit the field and put an index in, it becomes a link to the Heroku properties database:

image043

The property record now also shows a link back to the account:

image045

And that’s it. We now have accounts in the Salesforce CRM system with real-time lookups to the system of record in a Heroku database.

Using a Cloudant database with a BlueMix application

I wanted to learn how to use the Cloudant database with a BlueMix application. I found this great blog post Build a simple word game app using Cloudant on Bluemix by Mattias Mohlin. I’ve been working through it.

image001

I’ve learned a lot from it – as the writer says “I’ll cover aspects that are important when developing larger applications, such as setting up a good development environment to enable local debugging. My goal is to walk you through the development of a small Bluemix application using an approach that is also applicable to development of large Bluemix applications.” So it includes developing on a PC and also setting up Cloudant outside of BlueMix.

So here’s my simplified version focusing purely on getting an application up and running using a Cloudant BlueMix service and staying in DevOps Services as much as possible.

The first step is to take a copy of Mattias’s code so go to the GuessTheWord DevOps Services project.

Click on “Edit Code” and then “Fork”.

image003

I chose to use the same project name GuessTheWord – in DevOps Services it will be unique as it’s in my project space.

image005

This takes me into my own copy of the project so I can start editing it.

I need to update the host in the manifest file, otherwise the deployment will conflict with Mattias’s. In my case I change it to GuessTheWordGarforth, but you’ll need to change it to something else otherwise yours will clash with mine. Don’t forget to save the file with Ctrl-S, File/Save, or at least by changing file.

image007

Now I need to set up the app and bind the database on BlueMix so I click on “deploy”. I know it won’t run but it will start to set things up.

At this point I logged onto BlueMix itself for the first time and located the new GuessTheWord in the dashboard.

image009

I clicked on it and selected “add a service” and then scrolled down to the Cloudant NoSQL DB

image011
image013

I clicked on it, clicked “Create”, and then allowed it to restart the application. Unsurprisingly it still did not start, as there is more coding to do. However, the Cloudant service is there, so I clicked on “Show Credentials” and saw that the database has a username, password, URL and so on – the registration on the Cloudant site is not necessary as this is all handled by BlueMix.

image015
image017
Clicking on Runtime on the left and then scrolling down to Environment Variables, I can see that these Cloudant credentials have been set up as VCAP_SERVICES environment variables for my app. So I just need to change the code to use these.

I switch back to DevOps Services and go to the server.js file to modify the code for accessing this database.

I change line 27 from
Cloudant = env['user-provided'][0].credentials;
to
Cloudant = env['CloudantNoSQLDB'][0].credentials;

So we’re providing the high level environment variable not the name or the label.

Unfortunately there is also an error in Mattias’s code. I don’t know whether the BlueMix Cloudant service has changed since he wrote it, but he builds the URL for the database by adding the user ID and password to it, when in fact these are already embedded in the URL in my environment variable.

so I change line 30 from

var nano = require('nano')('https://' + Cloudant.username + ':' + Cloudant.password + '@' + Cloudant.url.substring(8));

to simply

var nano = require('nano')(Cloudant.url);

Now save the file and click Deploy. When it’s finished, a message pops up saying to see the manual deployment information on the root folder page.

image019

So I click on that and hopefully see a green traffic light in the middle.

image021

Clicking on the GuessTheWord hyperlink should take you to the working game, which in my case is running at

http://guessthewordgarforth.mybluemix.net/

image023

However there are still no scores displayed as there is no database table or entries.

I spent a long time trying to do this next part in the code but eventually ran out of time and had to go through the Cloudant website. If anyone can show me how to do this part in code I’d really appreciate it.
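My best guess at what that code might look like – untested, and assuming the same nano client, credentials, and database/view names used elsewhere in this post – is something like:

```javascript
// Untested sketch: create the hiscores database and its secondary index from
// code at startup, instead of through the Cloudant dashboard. Assumes the
// `Cloudant` credentials object already set up in server.js.
var nano = require('nano')(Cloudant.url);

var designDoc = {
  views: {
    top_scores_index: {
      // Same map function as the one entered manually on the Cloudant site
      map: "function(doc) { emit(doc.score, {score: doc.score, name: doc.name, date: doc.date}); }"
    }
  }
};

nano.db.create('guess_the_word_hiscores', function (err) {
  // An "already exists" error here is fine; carry on and ensure the index
  var db = nano.use('guess_the_word_hiscores');
  db.insert(designDoc, '_design/top_scores', function (err) {
    // A 409 conflict just means the design document is already there
  });
});
```

This would run once at startup; both calls ignore “already exists” errors, so it should be safe to run on every deployment.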

So for now, go to the GuessTheWord app on BlueMix and click on the running Cloudant service

image025

From here you get to a Launch button

image027

Pressing this logs you on to the Cloudant site using single sign on

image029

Create a new database named guess_the_word_hiscores. Then click the button to create a new secondary index. Store it in a document named top_scores and name the index top_scores_index. As Mattias says, the map function defines which objects in the database are categorised by the index and what information we want to retrieve for those objects. We use the score as the index key (the first argument to emit), then emit an object containing the score, the name of the player, and the date the score was achieved. Following is the JavaScript implementation of the map function, which we need to add before saving and building the index.

function(doc) {
  emit(doc.score, {score : doc.score, name : doc.name, date : doc.date});
}
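Once the index exists, the application can read it back through nano; a sketch (database and view names as above, query options per the standard CouchDB view API):

```javascript
// Sketch: query the secondary index for the top ten scores, highest first
var db = nano.use('guess_the_word_hiscores');
db.view('top_scores', 'top_scores_index', { descending: true, limit: 10 },
  function (err, body) {
    if (!err) {
      // Each row's value is the {score, name, date} object emitted by the map function
      var hiscores = body.rows.map(function (row) { return row.value; });
      console.log(hiscores);
    }
  });
```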

image031

Again, we should really be able to do this as part of the program startup, but for now the following URL adds an entry to the database (replace guessthewordgarforth in the URL with the host name you chose for your application):

http://guessthewordgarforth.mybluemix.net/save_score?name=Bob&score=4

You should see a success message. Enter the following URL, again replacing guessthewordgarforth with your application host name.

http://guessthewordgarforth.mybluemix.net/hiscores

The entry you just added should appear encoded in JSON e.g.

[{"score":4,"name":"Bob","date":"2014-08-07T14:27:34.553Z"}]

So, the code and the database are working correctly. Now it just remains to play the game. Go to

http://guessthewordgarforth.mybluemix.net

(replacing guessthewordgarforth with your hostname)

This time it will include Bob in the high score table:

image033

and click on “Play!”

game

Cloud computing trends in the UK: IaaS, PaaS & SaaS

This post was originally published on ThoughtsOnCloud on June 17th, 2014.

I’ve been a cloud architect for the last three years or so and have seen dramatic changes in the IT industry and its view of cloud. I’ve also observed different attitudes to cloud in different industries and countries.

I took on the cloud architect role because I saw that customers were asking about cloud, but they all had different ideas of what this meant. Depending on whom they spoke to first they could think it was hosted managed software as a service, or they could think it was on-premise dynamic infrastructure—or many other permutations between. My job was created to talk to them at the early stage, explain the full scope of what it means, to look at their business requirements and workloads and align them to the most appropriate solution.

Three years later you would hope that it’s all a lot clearer and in many ways it is, but there are still preconceptions that need to be explained, and also the cloud technologies themselves are moving so rapidly that it’s hard enough for people like me to stay abreast of it, let alone the customers.

To begin, I noticed some fairly obvious differences, many of which still hold. The large financial institutions wanted to keep their data on premise, and they had large enough IT departments that it made sense for them to buy the hardware and software to effectively act as cloud service providers to their lines of business. Some investment banks saw their technology as a key differentiator and asked that I treat them as a technology company rather than a bank, so they didn’t want to give away the ownership of IT; the attributes of cloud that they were looking for were standardisation, optimisation and virtualisation.

On the other hand I was working with retail companies and startups who saw IT as an unnecessary cost, a barrier to their innovation. They saw cloud as a form of outsourcing, where a service provider could take on the responsibility of looking after commodity IT and let them focus on their core business.

A third industry is government and public sector. This is very different in the UK to other countries. In the United States, the government is investing in large on-premise cloud solutions, and this avoids many of the security and scalability issues. In the UK, with a new government following the global financial crisis, there is an austerity programme, which led to the Government ICT Strategy and Government Digital Strategy and the announcement of the Cloud First Policy. This requires that government bodies use hosted, managed cloud offerings, as well as encouraging the use of open source and small British providers.

The British Parliament and Big Ben (Photo credit: ** Maurice **)

Our health sector is also very different to the U.S., with our public sector National Health Service being one of the largest employers in the world, whereas in the U.S. health has much more of an insurance focus.

Over the years in all industries there has been a lot of fear, uncertainty and doubt about the location of data and whether or not there are regulations that make this an issue. I’m glad to say that we’ve now worked through a lot of this and it’s a lot clearer to both the providers and the consumers.

In practice most of the cloud investment that happened was infrastructure as a service (IaaS). Much of this was private cloud, with some usage of public cloud IaaS.

We used to have a lot of interest from customers, whether they be meteorological or academic research, looking for high performance computing clouds. This made a lot of sense, as the hardware required for this is very expensive and some customers only need it for short periods of time, so to have it available on a pay as you go basis was very attractive. Last year, IBM acquired SoftLayer, which offers bare metal IaaS as well as virtualised. This means that HPC cloud is more attainable, and with this has come a shift in the perception of cloud away from virtualisation and towards the hosted, utility-based pricing view.

The big change this year is the move from IaaS to platform as a service (PaaS). With the nexus of forces of mobile devices (phones, tablets, wearable devices, internet of things), social media generating large amounts of unstructured data, and high performance broadband, there is a new demand and ability to deliver cloud based mobile apps connecting and exploiting data from multiple sources. This reflects a shift in the IT industry from the systems of record, which store the traditional, fairly static, structured data, to the new systems of engagement, which are much more about the dynamic customer interface and access to the fast changing data.

Developers are becoming key decision makers. They often work in the line of business and want to create business solutions quickly, without the blocker of the traditional IT department. Optimising the speed to market of business solutions by using IaaS and DevOps has been the first step in this. Now customers are looking to PaaS to give them immediate access to the whole software development environment: the infrastructure as well as the necessary middleware for developing, testing, and delivering solutions quickly and reliably with minimal investment. This also includes the new open source middlewares and hyperpolyglot languages.

Finally, SaaS. We are talking to many companies, public sector bodies, and education establishments, who want to become entirely IT free. They don’t want a data centre and they don’t want developers. This requirement is now becoming achievable as IBM and others are committed to making a significant proportion of their offerings available as SaaS solutions. Of course, this brings new challenges around hybrid cloud integration and federated security.

Do my views of the trends in UK cloud align to yours? I’d love to see your comments on this.

It’s all about the speed: DevOps and the cloud

This post was originally published on ThoughtsOnCloud on April 29th, 2014.

As I explained in my earlier blog post, “Cloud accelerates the evolution of mankind,” I believe that cloud has the power to change the world. It achieves this by giving us speed—and this has an exponential effect.

When I first became a professional software developer in the late 1980s, we spent two years designing and writing a software product that our team thought was what the market wanted. We went through a very thorough waterfall process (design, code, unit test, functional verification test, system test, quality assurance) and eventually released the product through sales and marketing. This cost us millions and eventually sold 14 copies.

More recently we’ve begun to adopt lean principles in software innovation and delivery to create a continuous feedback loop with customers. The thought is to get ideas into production fast, get people to use the product, get feedback, make changes based on the feedback and deliver the changes to the user. We need to eliminate any activity that is not necessary for learning what the customers want.

Speed is key. The first step was to move from waterfall to a more iterative and incremental agile software development framework. After that, the biggest delay was the provisioning of the development and test environments. Citigroup found that it took an average of 45 days to obtain space, power and cooling in the data center, have the hardware delivered and installed, have the operating system and middleware installed and begin development.  Today, we replace that step with infrastructure as a service (IaaS).

The next biggest delays in the software development lifecycle are the handovers. Typically a developer will work in the line of business. He will write, build and package his code and unit test it. He then needs to hand it over to the IT operations department to provide a production-like environment for integration, load and acceptance testing. Once this is complete the developer hands it over completely to the operations department to deploy and manage it in the production environment. These handovers inevitably introduce delay and also introduce the chance for errors as the operations team cannot have as complete an understanding of the solution as the developer. Also there will be differences between each environment and so problems can still arise with the solution in production.

By introducing a DevOps process, we begin to merge the development and operations teams and give the power to the developer to build solutions that are stable and easy for IT operations to deliver and maintain. Delivery tasks are tracked in one place, continuous integration and official builds are unified and the same deployment tool is used for all development and test environments so that any errors are detected and fixed early. With good management of the deployable releases, development can be performed on the cloud for provisioning to an on-premises production environment or the reverse; solutions can quickly be up and running in the cloud and as their usage takes off it may prove economical to move them on premises.

Of course there is risk in giving the power to the developer. The handover delays happen for a reason—to ensure that the solution is of sufficient quality to not break the existing environment. This is why the choice of tooling is so crucial. The IBM UrbanCode solution not only automates the process but provides the necessary governance.

Application-release-management

As I discussed in my blog post “Cloud’s impact on the IT team job descriptions,” introducing IaaS to this means that the role of the IT operations department is reduced. They may still be needed to support the environment, especially if it is a private cloud, but the cloud gives self service to the developer and tester to create and access production-like environments directly. It brings patterns to automatically create and recreate reproducible infrastructure and middleware environments.

In my next post I will discuss the next step that can be taken to increase the speed of the development cycle: platform as a service (PaaS). I’d love to hear what you think the benefits of DevOps with the cloud are, and any other ways to accelerate delivery. Please leave a comment below.


Cloud With DevOps Enabling Rapid Business Development

My point of view on accelerating business development with improved time to market by using lean principles enabled by devops and cloud.

What is Cloud Computing? Is everything cloud?

Cloud is a consumption model. It’s the idea of taking away all the IT skills and effort required by a user and letting them focus on their actual functional requirements. All the IT detail is hidden from them in The Cloud. Smart phones and tablets have really helped consumers understand this concept. They’ve become liberated. Knowing very little about IT, they have become empowered with self-service IT to access functionality on demand. Within seconds they can decide that they want a business application; they can find it on an app store, buy and install it themselves, and be up and running using it. When they’ve finished they can delete it.

CIOs are asking themselves why it can still take IT many months to get their business project up and running when in their personal lives they can have what they want when they want it.

The Cloud doesn’t take away the need for IT; for hardware, software, and systems management. It just encapsulates it. It puts it in the hands of the specialists working inside the cloud, and by centralising the IT and the skills costs can be reduced, risk can be reduced, businesses can focus on their core skills and have improved time to market and business agility.

It is confusing to talk about cloud without explaining whose point of view you’re looking at it from. Different people want different levels of complexity outsourced to the cloud.

Many users see cloud as a way of outsourcing all their IT. Some go even further and outsource the whole business process. I think the jury is out on whether cloud has to involve IT at all. Business Process as a Service (BPaaS) is talked about as one of the cloud offerings. I think the important thing is to let the customer get on with their core business and take away any activity that is not a differentiator for them.

Software as a Service (SaaS) is the area that most people think about first when they hear the word cloud. People have been using web based email for over 10 years. They don’t need to worry about maintaining a high spec PC and all the associated software. As long as they have a web browser they’re up and running. There is a move and a demand to make many, if not all, computer software applications available on the cloud, via simple consoles. Not unlike the idea of thin clients 15 years ago or mainframe terminals 40 years ago.

Moving down the stack a little further we come to a different group of users; the application developers. The people who want to be involved in IT, who want to create the business applications that run on the cloud. They still want to focus on business value though. They still want someone to take away the effort of writing the middleware. The code that is the same in 90% of all applications. The communication systems, the database, the interaction with the user. They want Platform as a Service (PaaS). An environment that’s just there, up and running, as and when they need it.

Finally we come to Infrastructure as a Service (IaaS). This is for real programmers or system administrators. For people who just want the base operating system to install or write the applications on, like they did in the old days. These people like the paradigm, of having a computer that’s all theirs. In the old days when their CIO wanted an environment for a new project they would request that someone find some data centre space, buy a PC, install it in the data centre with power and cooling etc, install the operating system, and then 6 months later hand it over to them to start the project development. Now they don’t need to worry about the physical world. They can just request the infrastructure as a service i.e. access to a brand new operating system install, and they’ll be up and running in minutes.

Of course these things can all run on top of one another. The business process can run on the software which runs on the platform which runs on the infrastructure, all provided as a service. But they don’t have to. The whole point is that the user doesn’t need to worry about what’s happening inside of their cloud. There could just be an infinite number of monkeys with an infinite number of typewriters working away inside the cloud. As long as the user is getting the service that they’re looking for they don’t care.

Which brings us to the other side of the picture. The cloud service providers. These can be traditional Managed Service Providers (MSPs), System Integrators, or the in house data centre offering the IT service to the lines of business. These guys are already taking the IT effort away from businesses, they’re already encapsulating and obfuscating the details of IT. But they’re in a competitive market, driven by the new expectations of the consumer and so they need to work smarter. They need to adopt some of these new architectures to be able to pass on the cost benefits and speed of delivery that their customer expects.

This is where some of the other terms associated with cloud computing come in – virtualisation, automation, standardisation. They’re not essential for cloud computing. The monkeys could do the job. But they really make it a lot easier. To make a step change improvement in delivery speed the IT departments need to share the environments on the same computer. Instead of having hundreds of servers running at 50% capacity they can just have one bigger one and schedule who’s using the capacity when. Instead of manually installing the application and all its dependencies and bug fixing and testing each part individually and together, they can standardise and automate and use virtual appliances to remove room for error. By introducing virtualisation, automation and self service a private data centre is moving towards and enabling cloud. Similarly, pay as you go, and sharing services between companies, are not cloud per se, but they are drivers towards and benefits from cloud.

So it can be confusing. People are talking about the same thing, but from different points of view. When people talk about cloud they might be talking about the hardware and automation in the data centre, or they might be talking about the complete absence of hardware by using business process outsourcing, they might be talking about handing all their data over to another company or they might be talking about making their private data more accessible to their own users.

So cloud covers a lot, but not everything.