T Minus One – the Accidental Discovery
“Kevin, are you guys still interested in that FB CMS app?” I asked.
“Sure we are, Ray. The community has real needs for it,” Kevin answered.
That was the start of my love affair with Ruby. It was the winter of 2008/2009. One morning, I went out for the usual walk with my dog. The night before, there had been a heavy snowstorm, almost 3 inches. Texas, being a southern state, was ill-equipped for snow. There were no trucks on the road spraying salt or sand, and naturally nobody on the street shoveling snow. The pavement was encased in ice and was as slick as a skating rink.
I decided to walk on the grass instead of sliding along the sidewalk. Suddenly, I stepped into an uncovered sprinkler hole. Apparently, my neighbor had a broken sprinkler head and had decided to replace it. With the sudden overnight snowfall, the job was left unfinished, and there was a big hole under the snow.
I limped back home with bruised knees. There were two big holes in my jeans where my knees would have been. My doctor told me to keep my legs straight and knees unbent for the next three weeks. Thus, I walked like C-3PO for most of February.
What should I do during that time? Lists of activities floated through my head: watching TV 24 hours a day, playing computer games, listening to music, Facebooking... Yet all of those seemed like fads to me, because somebody had either done them already or was doing them.
Suddenly, an interesting idea appeared. I remembered that my church's singles group leader had asked me, back in November 2008, to help write a little Craigslist-style app for our group, a request to which I had silently nodded.
That app had to run both inside and outside of Facebook, and would need to support the following:
- Group managers could approve and reject new listings, since repetitive listings posted 20,000 times would be annoying. (Role based administration)
- A listing would stay valid for a maximum of 31 days. After expiring, the listing’s owner could reactivate it with group admin’s approval. A job server would take care of the listing’s expiration. (Job server / State Machine)
- Only group members could create and edit listings. Non-members had to join our Facebook group prior to posting. (Security / Privacy)
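The listing life cycle described in the requirements above can be sketched as a tiny state machine. This is a minimal illustration, not the original code: the class and method names here are hypothetical, and the real app drove the expiration check from a job server.

```ruby
require "date"

# Hypothetical sketch of the listing life cycle: pending -> active ->
# expired, with reactivation routed back through admin approval.
class Listing
  MAX_AGE_DAYS = 31

  attr_reader :state

  def initialize(posted_on)
    @posted_on = posted_on
    @state = :pending   # new listings await a group manager's approval
  end

  def approve!
    @state = :active if @state == :pending
  end

  def reject!
    @state = :rejected if @state == :pending
  end

  # A job server would call this periodically to expire old listings.
  def check_expiry!(today = Date.today)
    @state = :expired if @state == :active && (today - @posted_on) > MAX_AGE_DAYS
  end

  # The owner may reactivate an expired listing; it then needs
  # the group admin's approval again, so it goes back to :pending.
  def reactivate!(today = Date.today)
    return unless @state == :expired
    @posted_on = today
    @state = :pending
  end
end
```
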
For the next three months, I toiled away at this volunteer project, accumulating over 600 hours. (Those were not Bunga Bunga nights; rather, late nights.) Then, at our group meeting, an important question arose: where to host the app?
First, I volunteered to host the app on my Mac Mini server. But with 800+ group members, its 4GB of memory was stretched to the limit, given that it was running a Mongrel cluster, a MySQL DB, JIRA, etc.
Furthermore, being a community project, the application data should not be owned by me, i.e. no hosting of live / private data on my machine. That turned out to be a smart decision; eight months later, my Mac Mini's boot disk failed.
Next, we decided to go with a low-cost dedicated hosting plan from a Florida company. But their network connection was quite flaky; it would lock up sometimes, causing 5+ seconds of latency when loading a simple element.
We urgently needed an alternative solution! What other choices were available? I decided to explore cloud computing. Being thrifty is critical for a community project such as ours. Moreover, paying 30 cents an hour is much more flexible than paying $250 a month.
Our safeguard was to regularly freeze and restart working VMs.
T Zero – The Cloud User Group
While it was fun that we could make snapshots and restart VMs at will in early 2009, I felt the need to work with other people rather than being a lone wolf. Thus, I organized the first cloud computing user group in the Dallas/Fort Worth area in 2009.
The first session was held at Cohabitat, in Uptown Dallas. It covered topics and offerings from various players, e.g. RightScale, Heroku, Engine Yard, Google App Engine, and Azure.
At the seminar, I ran into Nick, the partner enablement manager at Salesforce based in Dallas and a fellow ex-BEAer. He convinced me to give their platform a serious look. Running my app on Cloud 2: that sounded like an interesting proposition.
With Salesforce's tutorial PDFs in hand, I experimented with the Force.com platform. Suddenly, it hit me: I was repeating other people's effort. In effect, I was reinventing the wheel. Force.com already provided a rich, well-defined data model. Out of the box, there were already Account, Asset, Case, Group, Product, and User objects.
This led to a natural conclusion: the age of the Community Data Model (CDM) had begun. It is faster to build an app by extending the CDM than to write a DB schema from scratch.
As a programmer, I thought: with a little elbow grease, I could leverage that data model for my own purposes. (Since I was the lone developer on the church project, system admin and DB admin tasks were eating up my valuable time. I was literally working 16+ hours a day, developing and maintaining the system. Yep, it was unpaid work at its finest. So, I needed a way to offload those tasks.)
As the light bulb lit up in my head, I realized the need to build a bridge between the Salesforce platform and my app. After that, I could concentrate on my app logic while benefiting from the CDM. Perhaps I could even quickly prototype multiple applications.
T Plus One – Searching for the Perfect Adapter
A plan is just a plan until you turn it into reality. Locating a Ruby/Salesforce bridge turned out to be the hard part.
Since 2008, GitHub had already become my main version control repository. (At that time, there were only a single-digit number of Salesforce-related projects hosted on GitHub.)
Through Googling, I found a forked copy of the original 'activesalesforce' adapter resting on RubyForge. But that GEM's API version was still hard-coded to version 11. Also, there was hardly any documentation; even the inline API documentation was bare-bones. I had a hard time understanding it, much less using the adapter.
After further Googling, I found an old tutorial by the "Old Fart Developer". Struggling to set up the GEM took me another week and a half. Finally, I got lucky one day: I made a successful call to Salesforce and got back an XML representation of the Account object.
Chatter went into beta in late 2009, and the Chatter Developer Challenge was on. The competition involved building a cool Chatter app for the chance of winning a MacBook Pro. A few months earlier, I had just purchased my 17" model and didn't need another laptop. But the thought of building new tools and working in a new paradigm, utilizing and extending a common object model, attracted me. So I read all the available Chatter documentation thoroughly, although I decided not to enter the competition.
T Plus Two – Friend of a Friend
My other interest was language, as in real human languages. It is just my personal opinion that, given their long history, human languages are far richer than computer languages. Among them, I liked to revisit French, which I had studied at university.
In Dijon, I became interested in marrying embedded devices with cloud computing. Several months earlier, at the F8 conference in April 2010, I had thought about building an answering machine for social network feeds. Still, I wanted to push the boundary a bit further; I had already devised a plan to broadcast news feed information onto game consoles, e.g. the Xbox and PS3. For that reason, I would buy an Xbox 360. (Although I didn't game much, I was interested in exploring non-traditional ways of visualizing and expressing business data.) Of course, the technical challenge remained immense, as the Xbox has a reputation for being a closed system. It is tightly locked down, guarding against any attempt to send or receive information outside of the Live network. Still, I had a detailed design for a software bridge crossing the boundary between the game console and the Windows operating system.
My church project still needed improvements. Building the standalone front-end website was top of the agenda. I investigated the Hobo GEM, a nifty web app builder for RoR, and became very familiar with writing Hobo tag libraries. From there, I modified my app to extract and modify data in the Salesforce system.
I am a strong believer in the community-driven development approach. Eric Raymond's book, The Cathedral and the Bazaar, left a profound impression on my philosophy. So, I am more than happy to share my projects' source code publicly, e.g. SFRWatcher, the Facebook CMS app, WCI Portal Investigator, ...
Now, I remembered Nick and decided to send him some screenshots. Some of those pictures eventually floated up to Quinton. They showed that it was possible to retrieve the Chatter feed (including attachments) and perform CRUD operations on Salesforce objects, e.g. Account, User, Product2, etc.
One day at FNAC (the French version of Best Buy), I ran into an old acquaintance from a conference several years earlier. We talked a little, and he mentioned the University of Burgundy and how open-minded its computer science faculty was. Perhaps they would welcome people from industry giving presentations on the newest technologies. Since my most recent projects involved building extensions to cloud computing, they could likely benefit from my lectures/presentations on that topic. Additionally, I lived a 35-minute bus ride from the campus.
I came home and wrote a nice letter to the head of the department, who passed my contact information on to Christopher, their leading professor on semantic web and contextual search technologies. Now I had formed a good contact within French academia.
Going back to the other thread: to my surprise, Quinton, Salesforce's chief developer evangelist, wrote me back. After a few e-mail exchanges, he asked me to build a new adapter on top of the ActiveSalesforce GEM. Appropriately, I named my new GEM 'asf-soap-adapter' in remembrance of all the contributions by previous developers.
Even though I had played with RoR technologies for around three years, building an industrial-strength adapter was still a new experience for me. As a community sprang up around it, many useful suggestions from users profoundly affected its development.
I multi-tasked. In the morning, I had my French studies. After lunch, I went back to my apartment and either worked on my presentation for the university or developed the 'asf-soap-adapter'. As there was a nine-hour time difference between France and California, I would often work long hours, until 1 or 2 a.m., answering e-mails and taking conference calls.
In a matter of two weeks, I released the first GEM version, 1.0.1, at the end of September. It was my bundle of joy! Because I had had a really difficult start with the scanty documentation of the old GEM, this time I used the YARD tool and wrote extensive docs for my GEM.
T Plus Three – More Hacking
Next, Salesforce asked me to build a demo application for the Dreamforce 2010 conference. Based on the conversation, I knew Marc Benioff would show this application during his keynote presentation. Unbeknownst to me, the GEM would become the core technology for connecting the Ruby on Rails platform with Salesforce's Database.com announcement, cementing the $212 million Heroku acquisition.
For the next three weeks, we had a sprint plan for developing the demo application, 'dbzillademo' (http://dbzillademo.heroku.com/), using my adapter. The data came from ShopZilla / BizRate. At the second conference, they told me that they had uploaded a whopping 15 gigabytes. Now we had more technical challenges ahead. First, the Salesforce SOAP API has a hard cap on retrieval: a maximum of 2,000 rows per HTTP request. Furthermore, the SOAP API did not offer an offset feature, i.e. you cannot randomly jump from page 1 to page 25. If your query returned 50,000+ rows, it meant having to use the "queryLocator" about 25 times. At 2.5 seconds per call, the app would be locked up for 62.5 seconds. The second challenge came from Heroku's environment: calls exceeding a 30-second response time would cause a generic time-out page to be shown. While that made sense for standard applications, it was a real impetus for us to find solutions and improve the adapter (and the Ruby side of the platform).
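The paging constraint above can be illustrated with a short sketch. This is not the adapter's actual code: the client here is a stub standing in for a real Salesforce connection, with `query`/`query_more` methods mirroring the SOAP API's query/queryMore calls and its 2,000-row batch cap.

```ruby
BATCH_SIZE = 2_000  # the SOAP API's per-request hard cap

# Stub standing in for a real Salesforce SOAP client; it "serves"
# total_rows integers in batches, handing back a locator until done.
class StubClient
  def initialize(total_rows)
    @rows = (1..total_rows).to_a
  end

  # Returns [first batch, locator-for-the-rest-or-nil].
  def query(_soql)
    page(0)
  end

  def query_more(locator)
    page(locator)
  end

  private

  def page(offset)
    batch = @rows[offset, BATCH_SIZE] || []
    next_offset = offset + BATCH_SIZE
    locator = next_offset < @rows.size ? next_offset : nil
    [batch, locator]
  end
end

# Drain an entire result set; returns the rows and the number of
# round trips it took (at ~2.5 s each, this is where 62.5 s comes from).
def fetch_all(client, soql)
  rows, locator = client.query(soql)
  calls = 1
  while locator
    batch, locator = client.query_more(locator)
    rows.concat(batch)
    calls += 1
  end
  [rows, calls]
end
```

With 50,000 rows and a 2,000-row cap, `fetch_all` makes 25 round trips, which is exactly the serial-latency problem the text describes.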
For the next week, Quinton and I worked on a solution to this problem. After some hacking, we refined our SOQL query to bring the response time under 20 seconds. (I give much kudos to him, as he knows the Salesforce platform inside and out. Without him, the platform and demo would not have progressed so rapidly.)
Still, I wanted to see response times under 2 seconds. Thus, I experimented with MemCache, using server memory to minimize remote calls. The knowledge gained from developing the demo has been incorporated into a better REST adapter: in the 'asf-rest-adapter', the 'memcache' client has become a prerequisite.
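The caching idea is simple: consult memcached before making a remote call, so repeated requests skip the slow SOAP round trip. The sketch below is a hypothetical illustration, not the adapter's code; a plain Hash stands in for a real memcached client (one exposing get/set with a TTL), and `remote_fetch` is a placeholder for the actual Salesforce call.

```ruby
# Hypothetical cache-aside wrapper: check the cache first, fall back
# to the remote call only on a miss, then store the result.
class CachedFetcher
  def initialize(cache, ttl: 300)
    @cache = cache          # stand-in for a memcached client
    @ttl = ttl              # a real client would honor this expiry
    @remote_calls = 0
  end

  attr_reader :remote_calls

  def fetch(key)
    cached = @cache[key]
    return cached unless cached.nil?   # cache hit: no remote round trip
    value = remote_fetch(key)
    @cache[key] = value                # a real client: cache.set(key, value, @ttl)
    value
  end

  private

  # Placeholder for the slow SOAP call to Salesforce.
  def remote_fetch(key)
    @remote_calls += 1
    "payload-for-#{key}"
  end
end
```

With this pattern, only the first request for a given key pays the multi-second remote latency; subsequent requests within the TTL are served from memory.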
Around 30,000 people attended December's Dreamforce 2010 conference. It was an eye-opening experience. I saw many cool technologies; many groups and companies brought out their best products and ingenious solutions. It was a great learning experience. My demo, along with others, was considered an important topic and was exhibited in the Moscone Center West Developer Pavilion.
VMforce, a system for building Java web applications using Spring technology and Force.com data, had been announced much earlier, in March 2010. They set a very high bar for us, given their history and experience, and there are many areas where that combo hits the sweet spot. But our technology, RoR plus the Force.com platform, held up very well, particularly since it was conceived and developed much later.
In the future, I would like to work with our peers to incorporate and co-develop the "Cloud-surfin" concept: a community data model serving many variants of programming languages and deployment platforms.
I am a firm believer that we are at a junction in the evolution of technologies. Whereas the old model spawned a wealth of individual browser-based web apps, collaborative development and community effort will be the keys to success in the future, e.g. reusable objects, classes, and community data schemas. Borrowing from the aspect-oriented programming concept, the "cross-cutting" technique could also be applied to community data model / database design.
As a student of human languages, I note that French is flexible about the position of adjectives, which can be placed either before or after the noun they modify. Similarly, what is most visible to users is an application's feature set and behavior, e.g. the response time, the accuracy of search, etc. Where the data lives, whether in Oracle, MySQL, or a cloud-based data source, should be delegated to the best-performing system.
Over the course of this project, I learned a lot. Here are the particularly important tips. Test-driven development helps to reduce defects. Breaking large jobs into small chunks and deploying rapidly and frequently (running lean) keeps the project on course. User feedback is your best source of information. Yes, the community is the real genius, even if it is only a group of three people.