The Server-Side Pad

by Fabien Tiburce, Best practices and personal experiences with enterprise software

Archive for January 2009

The Case for Nearshoring to Canada

with 2 comments

Recent accounting scandals and political instability in India have exposed offshoring as an uncertain and risky proposition.  There are cost savings to offshoring, but is there value?  Let's see.  Offshoring is by no means convenient.  Coordinating efforts with individuals in different time zones takes much planning.  It can affect an organization's productivity and a product's time to market, as simple changes turn into costly iterations with high management overhead.  Communication requires extra work too, as the person on the other end may not have a sufficient grasp of English, Spanish or French (whatever the prevailing language of your organization and jurisdiction is) to communicate effectively.  Quality is harder to control, and issues often take longer to fix.  Lastly, let's not forget that a focus on customer satisfaction is very much a cultural phenomenon, one that may be blatantly absent in some parts of the world.  In other words, it is unreasonable to expect offshoring to be like working with local contractors, only cheaper.  It is nothing like working with local contractors: it will affect your operations, lengthen your time to market and may increase your risks.  All things considered, it may or may not be cheaper.

None of this is new of course.  In fact, many US companies realize the risks of offshoring and already nearshore to Canada.  The value proposition for Canada is strong.  Canada has political stability, infrastructure and an educated labor force.  It is aligned with the US time zones and has similar work ethics and business values (customer satisfaction, attention to detail, etc.).  Toronto alone is home to over 80 ethnic communities, with many languages spoken.  Lastly, with generally lower wages and a strong US dollar (1 USD = 1.25 CAD at the time of writing), the risks are low and the advantages real.

A decision to offshore should be based on overall value, not up-front cost savings. Canada’s value proposition is strong.

Also read follow up to this article at: https://betterdot.wordpress.com/2009/04/01/nearshoring-to-canada-your-comments/


Written by Compliantia

January 21, 2009 at 10:10 pm

Case Study: Serving Video Using a Content Delivery Network

with 3 comments

When all you have is a hammer, everything looks like a nail.  Nowhere is this truer than in systems.  While application servers can host static content (images, Flash videos, sound clips, etc.), they are not optimized for this purpose.  Ignoring this fact will typically affect the performance of the application and ultimately the user experience.  The organization may also be penalized with higher labour costs, slower time to market and higher hosting fees.  This article discusses content-hosting alternatives.

Years ago, few choices existed.  Organizations purchased load balancers and servers to host their static content.  Each dynamic page that needed a static resource would fetch it from a remote location, through the load balancer, typically via an API.  There is nothing wrong with doing that.  In fact, if your organization has little static content and/or untapped server capacity and bandwidth, this is still a good option.  But before you run out and purchase hardware, consider outsourcing the problem.  Large news organizations typically don't host their own content; perhaps neither should you.

While cloud computing is one obvious alternative, cloud computing is not, in and of itself, load balanced.  You still need to purchase computing instances, set up a software load balancer on one, and web servers on the others.  While this is typically cheaper than procuring hardware, you still have to set up and maintain software services, albeit on a cloud instance.  Cloud computing also doesn't necessarily let you reap the benefits of a geographically distributed infrastructure.  This is where Content Delivery Networks (CDNs) come in.  A CDN (such as Limelight, which we used to help deploy a large video-based training system) transparently serves content from a network of hundreds or thousands of servers.  The CDN delivers content based on the user's location by choosing the closest server, thus improving performance.  A CDN will also replicate changes automatically across the network, so when new or changed content is uploaded, all servers are notified.  Hosting your organization's content on a public network doesn't have to mean it is visible to all.  We used Limelight's MD5-based secret-key scheme to serve our content securely.  Alternatively, content can be set to expire automatically after a set period of time (useful for paid downloads).
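The general hash-token technique works roughly as follows: the origin and the CDN share a secret, and each URL carries an expiry time plus an MD5 hash of the secret, path and expiry; the CDN recomputes the hash and rejects any mismatch or expired link.  The sketch below is illustrative only: the secret, hostname and query-parameter names are hypothetical, and the exact URL layout varies by CDN vendor.

```python
import hashlib
import time

# Hypothetical shared secret; in practice this is provisioned by the CDN.
SECRET = "s3cr3t-shared-with-cdn"

def signed_url(path: str, ttl_seconds: int = 3600) -> str:
    """Build a time-limited URL.  The CDN recomputes the same MD5 hash
    from the shared secret, path and expiry, and refuses to serve the
    content if the hash does not match or the expiry has passed."""
    expires = int(time.time()) + ttl_seconds
    token = hashlib.md5(f"{SECRET}{path}{expires}".encode()).hexdigest()
    return f"http://cdn.example.com{path}?e={expires}&h={token}"
```

Because the secret never leaves the origin and the CDN, a user cannot forge a valid token or extend the lifetime of an expired link.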

A CDN not only improves system performance, it requires no setup and no maintenance, thus reducing labour costs.  It can prove cheaper for hosting and serving a large library of static content such as video.  It is highly redundant and can be made secure.

Written by Compliantia

January 13, 2009 at 6:14 pm

Case Study: Narrow-casting Using a Business Rules Engine

leave a comment »

The information age has allowed unprecedented access to information.  However, simply broadcasting all available information to all users of a system, regardless of responsibility, department, geographic location or language, is counter-productive.  To engage the user, the information must be relevant.  Narrow-casting allows an organization to disseminate content to a narrow audience.  The following case study discusses how Betterdot used a business rules engine to implement a corporate narrow-casting application.

Imagine a page consisting of four quadrants.  Each quadrant is self-contained and hosts a widget or control.  Each widget may display an instructional video, operational tips or information about an upcoming promotion.  In this model, the view (presentation layer) is simply a template that binds widgets to the page.  The template does not know which widgets will be displayed, only that a widget may be placed in each quadrant.  In this sense, the template is merely a wireframe.

The application server simply instantiates four empty quadrants and passes them, along with a reference to the logged-in user, to the business rules engine for processing.  When the engine is done, the application server does little more: it passes the result along to the view for display.  There is no business logic on the back-end, nor is there logic on the front-end.

The business rules engine allows complex rules to be written declaratively for each user and each quadrant.  A rule might state that a user with a certain responsibility or geographic location sees an instructional video while another user sees information about an upcoming promotion.  The rules can be arbitrarily complex, nested and/or recursive.  More importantly, the rules are soft-coded and can be loaded from a text file or Excel document.  The rules are not bundled with the system; they qualify its output.
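The quadrant-assignment pattern described above can be sketched in a few lines.  This is not Drools syntax, and the rule conditions and widget names are made up for illustration; in the real system the rules were declarative, externally maintained and evaluated by the engine.

```python
# Each rule pairs a condition on the user with a quadrant and a widget.
# In a production rules engine these would live outside the code, e.g.
# in a rules file or spreadsheet, and be editable without redeployment.
RULES = [
    (lambda u: u["role"] == "trainee",   0, "instructional_video"),
    (lambda u: u["region"] == "ontario", 1, "regional_promotion"),
    (lambda u: True,                     2, "operational_tips"),  # default
]

def fill_quadrants(user: dict) -> list:
    """Evaluate each rule against the logged-in user; the first match
    per quadrant wins, and unmatched quadrants remain empty (None)."""
    quadrants = [None] * 4
    for condition, slot, widget in RULES:
        if quadrants[slot] is None and condition(user):
            quadrants[slot] = widget
    return quadrants
```

The view then simply renders whatever widget (if any) each quadrant holds, which is why the template can remain a pure wireframe with no knowledge of the rules.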

Business rules engines are powerful allies in an enterprise stack.  They allow the rules that govern the operations of a business to be captured efficiently by technical and non-technical resources.  They allow rules to be changed quickly and cheaply (no software development cycles, no compilation, no deployment).  They allow rules to be reused across systems.

We used the Drools rules engine for the above implementation, a great tool with an unfortunate name.  Drools is developed as an open-source project by JBoss.

Written by Compliantia

January 13, 2009 at 5:26 pm

Moodle Learning Management System

leave a comment »

A Learning Management System, or LMS, is software that hosts and plays courseware.  As in other industries, the packaging of courseware has become standardized over the last few years.  In the learning industry, this standard is called SCORM and is managed by ADL, a US government initiative.  Companies purchase SCORM-compliant courseware from various vendors, knowing that any course compliant with the standard will play in the company's LMS.  LMSs have thus become commodities.  Few organizations build them.  Buy one then?  Sure, and there is no shortage of vendors.  There are, however, very competitive offerings emerging from the open-source community, the most noteworthy of which may be Moodle.  Moodle is a full-featured LMS supporting both SCORM 1.2 and SCORM 2004.  It has user and course management facilities, canned reports and a host of other features.  Built entirely in PHP (with no C++ back-end), it seems to lack a caching mechanism.  This could be a deterrent for a large implementation, although missing features could easily be added.  The product is very well documented (something some open-source projects lack) and was a breeze to install and configure.  Moodle can easily be customized and skinned to fit the requirements of a particular organization.  The product runs on Apache and MySQL, although we had it running on MS SQL Server in a few hours.  Organizations are increasingly turning to online delivery as a means of reducing costs while training their employees.  Features and costs make Moodle a very compelling offering.

Written by Compliantia

January 10, 2009 at 2:37 pm

Posted in Information Technology Posts


Analysis, Analysis, Analysis

leave a comment »

I am often reminded of the old real-estate adage about the three factors that matter most in real estate: location, location, location.  Likewise, while credit is often given to software development and integration efforts (and the teams and individuals behind them), every successful application and every on-time deployment is rooted in three factors: analysis, analysis, analysis.  Analysis determines who will use the system, what the system will do, what it won't do, and how it will integrate with other systems, functional processes and departments.  Analysis uncovers invalid assumptions, unforeseen dependencies, potential risks and roadblocks.  As such, analysis allows a precise functional and technical blueprint to be produced.  This blueprint (technical specifications, typically a set of use cases, UML diagrams, class diagrams and a database model) ensures the system fits the business and functional requirements.  It allows the system to be designed to meet the requirements of the business, which translates into shorter development cycles, fewer change requests and fewer bugs.

Analysis not only pays for itself, it significantly reduces the risk, cost and time associated with developing and integrating software.  It can take hours, days, weeks or even months.  But even with rapid prototyping and agile programming, skipping a formal analysis step often proves costly and risky.  For all intents and purposes, the three most important aspects of software development and integration might as well be analysis, analysis, analysis.

Written by Compliantia

January 1, 2009 at 5:06 pm