Connecting Salesforce to a Heroku Database

A popular use of Salesforce is as a front end system of engagement application, using a lookup integration to the more static system of record data in a back-end such as SAP. I wanted to set up a demo to show this but I didn’t have access to an SAP environment so I decided to simulate the back-end by creating a Postgres database in Heroku.

I decided to use some publicly available open data on properties as the database and for this to be looked up dynamically from the account record in Salesforce.

Here are the steps I went through.

First, get a Heroku account: go to heroku.com and click ‘Sign up for free’.

[Screenshot: the Heroku sign-up page]

Then log in.

On my computer I installed Node.js from https://nodejs.org/

and npm from https://github.com/npm/npm

and the Heroku Toolbelt from https://devcenter.heroku.com/articles/getting-started-with-nodejs#set-up

Once installed, you can use the heroku command from your command shell. Log in using the email address and password you used when creating your Heroku account:

[Screenshot: logging in to Heroku from the terminal]
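The command behind the screenshot is simply:

```shell
heroku login
```

It prompts for the email address and password of your Heroku account.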

Execute the following commands to clone the sample application:

[Screenshot: cloning the sample application]
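The screenshot showed the clone commands; at the time of writing the Heroku Node.js sample lived at the URL below (check the current getting-started guide if it has moved):

```shell
git clone https://github.com/heroku/node-js-getting-started.git
cd node-js-getting-started
```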

You now have a functioning git repository that contains a simple application as well as a package.json file, which is used by Node’s dependency manager.

[Screenshot: the contents of the cloned repository]

Now we create an app on Heroku, which prepares Heroku to receive the source code.

[Screenshot: creating the app on Heroku]
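The command shown was:

```shell
heroku create
```

Heroku picks the random app name for you; you can also pass a name of your own as an argument if you prefer.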

When you create an app, a git remote (called heroku) is also created and associated with your local git repository.

Heroku generates a random name (in this case glacial-sierra-8855) for your app.

The package.json file determines both the version of Node.js that will be used to run your application on Heroku, as well as the dependencies that should be installed with your application. When an app is deployed, Heroku reads this file and installs the appropriate node version together with the dependencies using the npm install command.

Run this command in your local directory to install the dependencies, preparing your system for running the app locally:

[Screenshot: installing the dependencies]
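That command is:

```shell
npm install
```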

Now we add a free Heroku Postgres Starter Tier dev database to the app.

[Screenshot: adding the Heroku Postgres add-on]
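The screenshot showed the add-on command. The exact plan name has changed over time, so check the current Heroku Postgres documentation; at the time this was roughly:

```shell
heroku addons:add heroku-postgresql:dev
```

(newer versions of the CLI use `heroku addons:create heroku-postgresql:hobby-dev` instead).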

Now we need to add Postgres to the path:

[Screenshot: adding Postgres to the path]
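The exact path depends on where Postgres is installed on your machine; the directory below is illustrative only. Once `psql` is on the path, `heroku pg:psql` connects you to the app’s database:

```shell
export PATH=$PATH:/usr/local/pgsql/bin
heroku pg:psql
```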

I found some sample data of properties here.

Download the CSV file to your current directory and delete the first line so that there’s only one header row.

Now add an index column: in Excel, insert a new column, put 1 and 2 as the first two entries, then highlight both cells and drag the fill handle down so that it auto-populates the other records with increasing indices:

[Screenshots: adding and auto-filling the index column in Excel]

Now we create the database table in the same format as the spreadsheet.

[Screenshot: creating the database table in psql]
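The column names below are illustrative, since they depend on the CSV you downloaded; the shape of the statement is what matters (note the index column added in Excel):

```sql
-- Adjust the column list to match your CSV headers
CREATE TABLE emptyproperties (
  index integer PRIMARY KEY,
  address varchar(200),
  town varchar(100),
  postcode varchar(10),
  vacant_date date
);
```

A quick `SELECT * FROM emptyproperties;` should then return an empty result set with the right columns.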

Run a select statement to check it worked:

[Screenshot: the select statement output]

Set the date format to be UK format like the spreadsheet.

[Screenshot: setting the date format]
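In psql this is a session setting; DMY tells Postgres to read ambiguous dates as day/month/year:

```sql
SET datestyle = 'ISO, DMY';
```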

Then copy the spreadsheet into the database:

[Screenshot: copying the spreadsheet into the database]
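Assuming the CSV is in the current directory and named as below (the filename is illustrative), psql’s client-side copy does the import:

```sql
\copy emptyproperties FROM 'emptyproperties.csv' WITH CSV HEADER
```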

So now we have an app, a database and a table. In order to access it from Salesforce we need to request that the database be enabled as an external object by raising a ticket at https://help.heroku.com/.

You will need to provide Support with the application name (in this case glacial-sierra-8855).

Once external object support has been enabled, go to https://connect.heroku.com/, set the app up as a Heroku Connect instance, and note the username, password and URL for it.

Select the ‘emptyproperties’ data source to share:

[Screenshot: selecting the ‘emptyproperties’ data source in Heroku Connect]

Now, to connect to this from Salesforce you need a Salesforce org with Lightning Connect/OData enabled. If you request a developer environment from https://developer.salesforce.com/signup you will get this functionality automatically.

In your Salesforce org:

  • Click Setup (upper right corner)
  • Click Develop > External Data Sources (left navigation)
  • Click New External Data Source
  • Enter OrderDB as the Label. As you click or tab away from the label field, the Name field should automatically default to OrderDB.
  • Select Lightning Connect: OData 2.0 as the Type.

(OrderDB doesn’t have to be the name, choose something meaningful for you).

Enter the URL, username and password from Heroku:

[Screenshot: the external data source settings]

  • Click ‘Validate and Sync’
  • Select the ‘emptyproperties’ table
  • Click ‘Sync’

[Screenshot: validating and syncing the table]

Then click into the ‘emptyproperties’ external object.

[Screenshot: the ‘emptyproperties’ external object]

You should be able to see that all the fields have been picked up from the database:

[Screenshot: the synced fields]

You can now create a custom tab to easily access properties:

  • Click Setup (upper right corner)
  • Click Create > Tabs
  • Click the New button next to Custom Object Tabs.
  • Select properties as the Object.
  • Click the selector next to Tab Style and choose whichever style you like.
  • Click Next.
  • Click Next to accept the default tab visibility settings.
  • Choose the apps that you want the tab to be included in.
  • Click Save.

[Screenshot: creating the custom tab]

Now there will be a new tab:

[Screenshot: the new tab]

Click on ‘Go’ to view all records. All the records in the database are now accessible by clicking through the external IDs.

[Screenshot: the property records]

Now go back to ‘external objects’ where you were just before creating the tab.

Now we want to make the index an external lookup. Click ‘edit’ next to index.

[Screenshot: editing the index field]

Select “change field type”:

[Screenshot: changing the field type]

  • Select External Lookup Relationship and click Next. An external lookup relationship can link any object to an external object.
  • Select emptyproperties as the value of Related To and click Next.

[Screenshot: the external lookup relationship settings]

  • Enter 4 as the value for Length and click Next.
  • Enable the Visible checkbox to make the relationship visible to all profiles, and click Next.
  • Click Save to accept the defaults.

Now go to the properties tab and select an external id and the full property detail is displayed:

[Screenshot: the full property detail]

Now let’s assign properties to accounts. So we’ll edit the account record and add a property:

  • Go to Setup > Customize > Accounts > Fields
  • Click New custom field
  • Select External Lookup Relationship
  • Relate it to properties, but change the field label to just Property
  • Step through the wizard and save it
  • Now when we go to an account we see an empty field for Property

[Screenshot: the empty Property field on the account]

If we edit the field and enter an index, it becomes a link to the Heroku properties database:

[Screenshot: the Property field linking through to Heroku]

Now the property also shows a link back to the account from the property:

[Screenshot: the link back to the account from the property]

And that’s it. We now have accounts in the Salesforce CRM system with real-time lookups to the system of record in a Heroku database.

A Guide to IBM Bluemix Resiliency and Security

This post was originally published on ThoughtsOnCloud on February 7th, 2015.

I’m pleased to say that it was also published for the 20,000 attendees at IBM Interconnect on Feb 26th.

IBM Bluemix is suitable for high performance, high input/output (I/O), high availability or latency-sensitive production applications, as well as development and test deployments. This is due to the IBM Bluemix configuration of Cloud Foundry within its data centers and the underlying strength of the IBM SoftLayer cloud infrastructure platform.

All Bluemix applications have their infrastructure automatically deployed as required and in real time. For example, if an application is dynamically scaled because it requires extra capacity, Bluemix handles it automatically. There is a full web-based management console and programmable management interfaces, which enable completely flexible monitoring of users’ applications.

IBM Bluemix configures Cloud Foundry in a highly available topology within the IBM SoftLayer data center. All Cloud Foundry components have been replicated to avoid any single point of failure (SPOF). These components include Droplet Execution Agent (DEA), Cloud Controller, router, Health Manager and login server. If any component fails it will be restarted within the data center while the remaining components provide continued availability. Other deployments can become available for the purposes of disaster recovery for IBM Bluemix applications.

IBM Bluemix exploits the IBM SoftLayer cloud infrastructure platform, hosted in data centers with Tier 3 resiliency. IBM SoftLayer provides a compelling set of service level agreements (SLAs) which in turn provide a strong platform for IBM Bluemix technology.

IBM Bluemix is able to exploit IBM SoftLayer’s triple network, which isolates public Internet, private application traffic and infrastructure management traffic. Together with highly redundant servers, each of which has five network cards, and the ability to seamlessly integrate with secure client private networks, IBM Bluemix applications benefit from a highly available and resilient network.

A large catalog of application services is available, each of which typically provides an appropriate range of priced service levels. The service plan will document a priced service level as well as the free service tier. While the free tier provides the ability for developers to try out the functional behavior, the priced levels provide increasing operational quality of service. This service plan is fully documented with the details of the service performance and capacity, as well as specifying high availability and disaster recovery options. This flexible service approach enables departments to match their development and operations with the appropriate service plan to ensure the most economical mix of service levels.

The IBM approach to information assurance is to provide evidence according to government security principles. IBM Bluemix and its underlying cloud platform infrastructure, IBM SoftLayer, are designed to comply with these 14 principles for all security elements including people, process and technology.

The IBM SoftLayer cloud infrastructure platform has already demonstrated compliance with SOC2 Type II, EU Safe Harbor, and CSA STAR CAIQ and CCM self-assessments, as well as the ISO 9000 quality assurance standard. These standards represent the ongoing commitment to the European Commission data privacy requirements.

From an engineering and support perspective, IBM Bluemix and its underlying cloud infrastructure technologies undergo continuous rigorous security testing in accordance with IBM Secure Engineering development practices. If a security exposure is identified by IBM or a third party, then IBM Support will use the IBM Product Security Incident Response Team (PSIRT) process to apply appropriate and timely updates to ensure the overall system security and integrity is maintained.

As you can see, the security and compliance offered by Bluemix is attractive and comprehensive. Do you think Bluemix is right for you?

Sending SMS messages using Twilio and Bluemix

samjgarforth:

Here’s an excellent post on setting up a Bluemix app to send SMS messages.

Originally posted on Martin Gale's blog:

I’ve been tinkering with an Internet of Things project at home for a while which I’ll write up in due course, but in the course of doing so have knocked up a few useful fragments of function that I thought I’d share in case other people need them. The first of these is a simple Node.js app to send an SMS message via Twilio using IBM Bluemix.

There’s lots of material on Twilio and Bluemix but by way of a very rapid summary, Twilio provide nice, friendly APIs over telephony-type services (such as sending SMS messages), and Bluemix is IBM’s Cloud Foundry-based Platform-as-a-Service offering to enable developers to build applications rapidly in the cloud. Twilio have created a service within Bluemix that developers can pick up and use to enable their applications with the Twilio services. One of the things I wanted for my application was a way of notifying me that…


Sam’s Views on Cloud for Government Policy Makers

I was honoured to be asked to present yesterday on “Cloud Skills, Flexibility and Strategy” at the Westminster eForum Keynote Seminar: Next steps for cloud computing.

English: The Palace of Westminster from Whitehall. (Photo credit: Wikipedia)

As explained on its website, Westminster Forum Projects enjoys substantial support and involvement from key policymakers within UK and devolved legislatures, governments and regulatory bodies and from stakeholders in professional bodies, businesses and their advisors, consumer organisations, local government representatives, and other interested groups. The forum is structured to facilitate the formulation of ‘best’ public policy by providing policymakers and implementers with a sense of the way different stakeholder perspectives interrelate, with the aim of providing policymakers with context for arriving at whatever decisions they see fit.

The abstract to the session asked about the extent to which Government departments are embracing the cloud, what progress is being made in achieving the UK’s Data Capability Strategy on skills and infrastructure development, whether organisations are doing enough to address the emerging shortfall in skills, and also about the apparent contradiction between mobile device power and cloud.

I was part of a panel and the following was my five minute introduction.

In my five minutes I’d like to talk about the power of cloud and within that to address three areas raised in the abstract to this session – shared services and shared data; mobile; and skills.

We see cloud as being used in three different ways – optimisation, innovation and disruption. Most of what I’ve seen so far in cloud adoption is about optimisation or cost saving. How to use standardisation, automation, virtualisation and self service to do the same things cheaper and faster.

What’s more interesting is the new things that can be achieved with the innovation and disruption that this can provide.

I’ve been working with various groups – local authorities, police forces, and universities, discussing consolidating their data centres. Instead of each one managing their own IT environment, they can share it in a cloud. They justify this with the cost saving argument but the important thing is, firstly, that they can stop worrying about IT and focus on what their real role is, and secondly that by putting their data together in a shared environment they can achieve things that they’ve never done before.

English: The road to Welton, East Riding of Yorkshire, just south of Riplingham. Taken on the Riplingham to Welton road at MR: SE96293086 looking due south. This is typical south Yorkshire Wolds country. (Photo credit: Wikipedia)

For example, Ian Huntley would never have been hired as a caretaker, and so the Soham murders would have been less likely to happen, if the police force had had access to the data showing that he was known to a different force.

And we wouldn’t have issues with burglars crossing the border between West and North Yorkshire to avoid detection if data was shared.

In Sunderland we predict £1.4m per year in cost savings by optimising their IT environment but what’s more important is that this has helped to create a shared environment for start up companies to get up and running quickly so it’s stimulating economic growth in the area.

Another example is Madeleine McCann. After her disappearance it was important to collect holiday photos from members of the public as quickly as possible. Creating a website for this before cloud would have taken far too long. Nowadays it can be spun up very quickly. This isn’t about cost saving and optimisation, it’s about achieving things that could never have been done before.

This brings me to the question in the abstract about mobile: “As device processing power increases, yet cloud solutions rely less and less on that power, is there a disconnect between hardware manufacturers and app and software developers”. I think this is missing the point. Cloud isn’t about shifting the processing power from one place to another, it’s about doing the right processing in the right place.

English: GPS navigation solution running on a smartphone (iphone) mounted to a road bike. GPS is gaining wide usage with the integration of GPS sensors in many mobile phones. (Photo credit: Wikipedia)

In IBM we talk about CAMS – the nexus of forces of Cloud, Analytics, Mobile and Social, and we split the IT into Systems of Record and Systems of Engagement. The Systems of Record are the traditional IT – the databases that we’re talking about moving from the legacy data centres to the cloud. And, as we’ve discussed, putting it into the cloud means that a lot of new analytics can happen here. With mobile and social we now have Systems of Engagement. The devices that interact with people and the world. The devices that, because of their fantastic processing power, can gather data that we’ve never had access to before. These devices mean that it’s really easy to take a photo of graffiti or a hole in the road and send it to the local council through FixMyStreet and have it fixed. It’s not just the processing power, it’s the instrumentation that this brings. We now have a GPS location so the council know exactly where the hole is. And of course this makes it a lot easier to send photos and even videos of Madeleine McCann to a photo analytics site.

We’re also working with Westminster council to optimise their parking. The instrumentation and communication from phones helps us do things we’ve never done before, but then we move onto the Internet of Things and putting connected sensors in parking spaces.

With connected cars we have even more instrumentation and possibilities. We have millions of cars with thermometers, rain detection, GPS and connectivity that can tell the Met Office exactly what the weather is with incredible granularity, as well as the more obvious solutions like traffic optimisation.

Moving on to skills: IBM has an Academic Initiative where we give free software to universities, work with them on the curriculum, and even act as guest lecturers. With Imperial College we’re providing cloud based marketing analytics software as well as data sets and skills, so that they can focus on teaching the subject rather than worrying about the IT. With computer science in school curriculums changing to be more about programming skills, we can offer cloud based development environments like IBM Bluemix. We’re also working with the Oxford and Cambridge examination board on their modules for cloud, big data and security.

Classroom 010 (Photo credit: Wikipedia)

To be honest, it’s still hard. Universities are a competitive environment and they have to offer courses that students are interested in rather than ones that industry and the country need. IT is changing so fast that we can’t keep up. Lecturers will teach subjects that they’re comfortable with and students will apply for courses that they understand or that their parents are familiar with. A university recently offered a course on social media analytics, which you’d think would be quite trendy and attractive but they only had two attendees. It used to be that universities would teach theory and the ability to learn and then industry would hire them and give them the skills, but now things are moving so fast that industry doesn’t have the skills and is looking for the graduates to bring them.

Looking at the strategy of moving to the cloud, and the changing role of the IT department, we’re finding that by outsourcing the day to day running of the technology there is a change in skills needed. It’s less about hands on IT and more about architecture, governance, and managing relationships with third party providers. A lot of this is typically offered by the business faculty of a university, rather than the computing part. We need these groups to work closer together.

To a certain extent we’re addressing this with apprenticeships. IBM’s been running an apprenticeship scheme for the last four years. This on-the-job training means that industry can give hands-on training with the best blend of up-to-the-minute technical, business and personal skills, and this has been very effective, with IBM winning Best Apprenticeship Scheme awards from the Target National Recruitment Awards, the National Apprenticeship Service, and Everywoman in Technology.

In summary, we need to be looking at the new things that can be achieved by moving to cloud and shared services; exploiting mobile and the internet of things; and training for the most appropriate skills in the most appropriate way.

Using a Cloudant database with a BlueMix application

I wanted to learn how to use the Cloudant database with a BlueMix application. I found this great blog post Build a simple word game app using Cloudant on Bluemix by Mattias Mohlin. I’ve been working through it.

[Screenshot: the ‘Build a simple word game app using Cloudant on Bluemix’ tutorial]

I’ve learned a lot from it – as the writer says “I’ll cover aspects that are important when developing larger applications, such as setting up a good development environment to enable local debugging. My goal is to walk you through the development of a small Bluemix application using an approach that is also applicable to development of large Bluemix applications.” So it includes developing on a PC and also setting up Cloudant outside of BlueMix.

So here’s my simplified version focusing purely on getting an application up and running using a Cloudant BlueMix service and staying in DevOps Services as much as possible.

The first step is to take a copy of Mattias’s code so go to the GuessTheWord DevOps Services project.

Click on “Edit Code” and then “Fork”.

[Screenshot: the Edit Code and Fork buttons]

I chose to use the same project name GuessTheWord – in DevOps Services it will be unique as it’s in my project space.

[Screenshot: naming the forked project]

This takes me into my own copy of the project so I can start editing it.

I need to update the host in the manifest file, otherwise the deployment will conflict with Mattias’s. In my case I change it to GuessTheWordGarforth, but you’ll need to change it to something else, otherwise yours will clash with mine. Don’t forget to save the file with Ctrl-S or File > Save.
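For reference, the relevant part of manifest.yml looks something like this (the values here are illustrative; only the host line needs to change):

```yaml
applications:
- name: GuessTheWord
  host: GuessTheWordGarforth
  memory: 128M
```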

[Screenshot: updating the host in the manifest file]

Now I need to set up the app and bind the database on BlueMix so I click on “deploy”. I know it won’t run but it will start to set things up.

At this point I logged onto BlueMix itself for the first time and located the new GuessTheWord in the dashboard.

[Screenshot: GuessTheWord in the BlueMix dashboard]

I clicked on it, selected “Add a Service” and then scrolled down to the Cloudant NoSQL DB.

[Screenshots: adding the Cloudant NoSQL DB service]

I clicked on it, clicked on “Create” and then allowed it to restart the application. Unsurprisingly it still did not start, as there is more coding to do. However, the Cloudant service is there, so I clicked on “Show Credentials” and saw that the database has a username, password, URL and so on, so the registration step on the Cloudant site is not necessary as this is all handled by BlueMix.

[Screenshots: the Cloudant service credentials]

Clicking on Runtime on the left and then scrolling down to Environment Variables, I can see that these Cloudant credentials have been set up in the VCAP_SERVICES environment variable for my app. So I just need to change the code to use these.

I switch back to DevOps Services and go to the server.js file to modify the code for accessing this database.

I change line 27 from

Cloudant = env['user-provided'][0].credentials;

to

Cloudant = env['CloudantNoSQLDB'][0].credentials;

So we’re using the service’s key in VCAP_SERVICES, not its name or label.

Unfortunately there is also an error in Mattias’s code. I don’t know whether the BlueMix Cloudant service has changed since he wrote it, but he builds the URL for the database by adding the user ID and password to it, when in fact these are already included in the URL in my environment variable.

So I change line 30 from

var nano = require('nano')('https://' + Cloudant.username + ':' + Cloudant.password + '@' + Cloudant.url.substring(8));

to simply

var nano = require('nano')(Cloudant.url);

Now save the file and click Deploy. When it’s finished, a message pops up saying to see the manual deployment information on the root folder page.

[Screenshot: the manual deployment message]

So I click on that and hopefully see a green traffic light in the middle.

[Screenshot: the green traffic light on the root folder page]

Click on the GuessTheWord hyperlink and it should take you to the working game, which in my case is running at

http://guessthewordgarforth.mybluemix.net/

[Screenshot: the running game]

However there are still no scores displayed as there is no database table or entries.

I spent a long time trying to do this next part in the code but eventually ran out of time and had to go through the Cloudant website. If anyone can show me how to do this part in code I’d really appreciate it.
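In case it helps anyone pick this up, here is a sketch of how the database and index could be created in code at startup using nano, the module server.js already uses. I haven’t run this against a live Cloudant service, so treat it as a starting point rather than a working answer:

```javascript
// Create the database and the top_scores index at startup if they
// don't already exist. 'Cloudant' is the credentials object that
// server.js reads from VCAP_SERVICES.
var nano = require('nano')(Cloudant.url);

nano.db.create('guess_the_word_hiscores', function (err) {
  // An error here is expected if the database already exists; ignore it.
  var db = nano.use('guess_the_word_hiscores');
  var designDoc = {
    views: {
      top_scores_index: {
        // Same map function as the one created through the Cloudant UI
        map: 'function(doc) { emit(doc.score, ' +
             '{score: doc.score, name: doc.name, date: doc.date}); }'
      }
    }
  };
  db.insert(designDoc, '_design/top_scores', function (err) {
    // Again, an error here just means the design document is already there.
  });
});
```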

So for now, go to the GuessTheWord app on BlueMix and click on the running Cloudant service

[Screenshot: the running Cloudant service]

From here you get to a Launch button

[Screenshot: the Launch button]

Pressing this logs you on to the Cloudant site using single sign on

[Screenshot: the Cloudant dashboard]

Create a new database named guess_the_word_hiscores. Then click the button to create a new secondary index. Store it in a document named top_scores and name the index top_scores_index. As Mattias says, the map function defines which objects in the database are categorised by the index and what information we want to retrieve for those objects. We use the score as the index key (the first argument to emit), then emit an object containing the score, the name of the player, and the date the score was achieved. Following is the JavaScript implementation of the map function, which we need to add before saving and building the index.

function(doc) {
  emit(doc.score, {score : doc.score, name : doc.name, date : doc.date});
}

[Screenshot: creating the secondary index]

Again, we should really be able to do the following as part of the program startup, but in any case the following URL should add an entry to the database; replace guessthewordgarforth in the URL with the host name you chose for your application:

http://guessthewordgarforth.mybluemix.net/save_score?name=Bob&score=4

You should see a success message. Enter the following URL, again replacing guessthewordgarforth with your application host name.

http://guessthewordgarforth.mybluemix.net/hiscores

The entry you just added should appear encoded in JSON, e.g.

[{"score":4,"name":"Bob","date":"2014-08-07T14:27:34.553Z"}]

So, the code and the database are working correctly. Now it just remains to play the game. Go to

http://guessthewordgarforth.mybluemix.net

(replacing guessthewordgarforth with your hostname)

This time it will include Bob in the high score table

[Screenshot: the high score table including Bob]

and click on “Play!”

[Screenshot: playing the game]

Cloud computing trends in the UK: IaaS, PaaS & SaaS

This post was originally published on ThoughtsOnCloud on June 17th, 2014.

I’ve been a cloud architect for the last three years or so and have seen dramatic changes in the IT industry and its view of cloud. I’ve also observed different attitudes to cloud in different industries and countries.

I took on the cloud architect role because I saw that customers were asking about cloud, but they all had different ideas of what this meant. Depending on whom they spoke to first they could think it was hosted managed software as a service, or they could think it was on-premise dynamic infrastructure—or many other permutations between. My job was created to talk to them at the early stage, explain the full scope of what it means, to look at their business requirements and workloads and align them to the most appropriate solution.

Three years later you would hope that it’s all a lot clearer and in many ways it is, but there are still preconceptions that need to be explained, and also the cloud technologies themselves are moving so rapidly that it’s hard enough for people like me to stay abreast of it, let alone the customers.

To begin, I noticed some fairly obvious differences, many of which still hold. The large financial institutions wanted to keep their data on premise, and they had large enough IT departments that it made sense for them to buy the hardware and software to effectively act as cloud service providers to their lines of business. Some investment banks saw their technology as a key differentiator and asked that I treat them as a technology company rather than a bank. They didn’t want to give away the ownership of IT; the attributes of cloud they were looking for were standardisation, optimisation and virtualisation.

On the other hand, I was working with retail companies and startups who saw IT as an unnecessary cost, a barrier to their innovation. They saw cloud as a form of outsourcing, where a service provider could take on the responsibility of looking after commodity IT and let them focus on their core business.

A third industry is government and public sector. This is very different in the UK to other countries. In the United States, the government is investing in large on-premise cloud solutions, and this avoids many of the security and scalability issues. In the UK, with a new government following the global financial crisis, there is an austerity programme, which led to the Government ICT Strategy and Government Digital Strategy and the announcement of the Cloud First Policy. This requires that government bodies use hosted, managed cloud offerings, as well as encouraging the use of open source and small British providers.

The British Parliament and Big Ben (Photo credit: ** Maurice **)

Our health sector is also very different to the U.S., with our public sector National Health Service being one of the largest employers in the world, whereas in the U.S. health has much more of an insurance focus.

Over the years in all industries there has been a lot of fear, uncertainty and doubt about the location of data and whether or not there are regulations that make this an issue. I’m glad to say that we’ve now worked through a lot of this and it’s a lot clearer to both the providers and the consumers.

In practice most of the cloud investment that happened was infrastructure as a service (IaaS). Much of this was private cloud, with some usage of public cloud IaaS.

We used to have a lot of interest from customers, whether in meteorology or academic research, looking for high performance computing clouds. This made a lot of sense, as the hardware required for this is very expensive and some customers only need it for short periods of time, so to have it available on a pay as you go basis was very attractive. Last year, IBM acquired SoftLayer, which offers bare metal IaaS as well as virtualised. This means that HPC cloud is more attainable, and with this has come a shift in the perception of cloud away from virtualisation and closer to the hosted, utility based pricing view.

The big change this year is the move from IaaS to platform as a service (PaaS). With the nexus of forces of mobile devices (phones, tablets, wearable devices, internet of things), social media generating large amounts of unstructured data, and high performance broadband, there is a new demand and ability to deliver cloud based mobile apps connecting and exploiting data from multiple sources. This reflects a shift in the IT industry from the systems of record, which store the traditional, fairly static, structured data, to the new systems of engagement, which are much more about the dynamic customer interface and access to the fast changing data.

Developers are becoming key decision makers. They often work in the line of business and want to create business solutions quickly, without the blocker of the traditional IT department. Optimising the speed to market of business solutions by using IaaS and DevOps has been the first step in this. Now customers are looking to PaaS to give them immediate access to the whole software development environment: the infrastructure as well as the middleware needed for developing, testing and delivering solutions quickly and reliably with minimal investment. This also includes the new open source middleware and polyglot languages.

Finally, SaaS. We are talking to many companies, public sector bodies, and education establishments that want to become entirely IT free. They don’t want a data centre and they don’t want developers. This is now becoming achievable as IBM and others commit to making a significant proportion of their offerings available as SaaS solutions. Of course, this brings new challenges around hybrid cloud integration and federated security.

Do my views of the trends in UK cloud align with yours? I’d love to see your comments on this.

It’s all about the speed: DevOps and the cloud

This post was originally published on ThoughtsOnCloud on April 29th, 2014.

As I explained in my earlier blog post, “Cloud accelerates the evolution of mankind,” I believe that cloud has the power to change the world. It achieves this by giving us speed—and this has an exponential effect.

When I first became a professional software developer in the late 1980s, we spent two years designing and writing a software product that our team thought was what the market wanted. We went through a very thorough waterfall process of design, code, unit test, functional verification test, system test and quality assurance, and eventually released the product through sales and marketing. This cost us millions and eventually sold 14 copies.

More recently we’ve begun to adopt lean principles in software innovation and delivery to create a continuous feedback loop with customers. The thought is to get ideas into production fast, get people to use the product, get feedback, make changes based on the feedback and deliver the changes to the user. We need to eliminate any activity that is not necessary for learning what the customers want.

Speed is key. The first step was to move from waterfall to a more iterative and incremental agile software development framework. After that, the biggest delay was the provisioning of development and test environments. Citigroup found that it took an average of 45 days to obtain space, power and cooling in the data center, have the hardware delivered and installed, have the operating system and middleware installed, and begin development. Today, we replace that step with infrastructure as a service (IaaS).

The next biggest delays in the software development lifecycle are the handovers. Typically a developer working in the line of business will write, build, package and unit test the code, then hand it over to the IT operations department to provide a production-like environment for integration, load and acceptance testing. Once this is complete, the developer hands it over completely to the operations department to deploy and manage in the production environment. These handovers inevitably introduce delay, and also the chance of errors, because the operations team cannot have as complete an understanding of the solution as the developer. There will also be differences between each environment, so problems can still arise with the solution in production.

By introducing a DevOps process, we begin to merge the development and operations teams and give the power to the developer to build solutions that are stable and easy for IT operations to deliver and maintain. Delivery tasks are tracked in one place, continuous integration and official builds are unified and the same deployment tool is used for all development and test environments so that any errors are detected and fixed early. With good management of the deployable releases, development can be performed on the cloud for provisioning to an on-premises production environment or the reverse; solutions can quickly be up and running in the cloud and as their usage takes off it may prove economical to move them on premises.
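The principle of one unified deployment process for every environment can be sketched as a hypothetical CI/CD pipeline configuration (the stage names, environments and commands here are illustrative only, not taken from IBM UrbanCode or the original post):

```yaml
# Hypothetical pipeline: one build produces a single versioned
# artifact, and one deploy job, parameterised by environment,
# runs identical steps against dev, test and production.
stages:
  - build
  - deploy

build:
  stage: build
  script:
    - npm install        # install dependencies
    - npm test           # unit tests run on every commit
    - npm pack           # produce the deployable artifact

deploy:
  stage: deploy
  script:
    - ./deploy.sh "$TARGET_ENV"   # the same script for every environment
  parallel:
    matrix:
      - TARGET_ENV: [dev, test, production]
```

Because the deploy step is identical everywhere, environment drift and handover errors surface in dev or test rather than in production.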

Of course there is risk in giving the power to the developer. The handover delays happen for a reason: to ensure that the solution is of sufficient quality not to break the existing environment. This is why the choice of tooling is so crucial. The IBM UrbanCode solution not only automates the process but also provides the necessary governance.

Application-release-management

As I discussed in my blog post “Cloud’s impact on the IT team job descriptions,” introducing IaaS to this means that the role of the IT operations department is reduced. They may still be needed to support the environment, especially if it is a private cloud, but the cloud gives self service to the developer and tester to create and access production-like environments directly. It brings patterns to automatically create and recreate reproducible infrastructure and middleware environments.

In my next post I will discuss the next step that can be taken to increase the speed of the development cycle: platform as a service (PaaS). I’d love to hear what you think are the benefits of DevOps with the cloud, and other ways to accelerate delivery. Please leave a comment below.

