2013-10-07

How do you deploy multiple versions of the same portlet in Liferay?

While developing a portal site, it can be very helpful to support deploying multiple versions of the same portlet.  Some of the reasons our development team has encountered include:

  • Helping to debug issues that are introduced in new versions
  • Comparing functionality and performance between two portlets, while keeping everything else equal
  • When a portlet is used multiple times on a single site, it can be advantageous to deploy multiple versions of it so that the dependent portlets don't all have to be updated when new features are added to the shared portlet
  • In Liferay, JAR files are not cleaned up when a portlet is redeployed.  Therefore, if the JAR files a portlet uses are updated and the portlet is redeployed, both the original and the new JAR files will be in the lib directory.  This can cause issues if the new JAR files contain different versions of the same classes.
    • We have run into this situation a couple of times, and it leads to confusing and unexpected results
To properly version a portlet, you need to make two things unique, which we achieved by adding a version number to each:
  1. Make the directory the WAR file is deployed to (in the webapps directory) unique so that Liferay treats each version as a separate portlet
  2. Make the name of the portlet that shows up in Liferay's Add menu unique so that you can control the version of the portlet that is added to a page, and later on determine which version of the portlet is on each page
Controlling the webapps directory

Through experimentation, I found that the webapps directory is based on the WAR filename.  The WAR filename (minus the .war extension) is used as the name of the webapps directory, unless the filename contains one of a few special character sequences, in which case everything after the sequence is ignored.  The sequences I know about are: -portlet, -hook and -ext.  Here are a few examples:

WAR Filename                        webapps directory
calendar_1.0.3.1.war                calendar_1.0.3.1/
calendar_1_0_4_0.war                calendar_1_0_4_0/
myportlet-portletAA.war             myportlet-portlet/
crazystuff-ext.war                  crazystuff-ext/
myaccount-hook12-production.war     myaccount-hook/

Controlling the portlet name (in Liferay)

The first thing that needs to be done is to make the portlet ID unique so that Liferay can track each version separately.  The portlet ID is set in liferay-display.xml.  I simply concatenate the version number onto the end of the portlet ID.
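
For example (the category and portlet ID below are made up for illustration), the versioned entry in liferay-display.xml looks something like this:

    <display>
        <category name="category.sample">
            <portlet id="calendar_1_0_4_0" />
        </category>
    </display>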

Next is to update the portlet name.  The name must be updated and kept consistent in both portlet.xml and liferay-portlet.xml.  To keep them consistent, I simply concatenate the version number onto the end of the portlet name, as in the sketch below.
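
The matching entries might look something like the following (again with made-up names, and with unrelated elements omitted):

    <!-- portlet.xml -->
    <portlet>
        <portlet-name>calendar_1_0_4_0</portlet-name>
        <!-- other portlet elements omitted -->
    </portlet>

    <!-- liferay-portlet.xml -->
    <portlet>
        <portlet-name>calendar_1_0_4_0</portlet-name>
        <!-- other Liferay-specific elements omitted -->    </portlet>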

One thing to keep in mind is that there is a known bug in Liferay where an exception is thrown if the portlet name contains a hyphen ('-').  We also ran into issues with periods ('.') in the name and in the WAR filename.  So we avoid both of those characters (as well as spaces) and replace them with underscores ('_').

Maven
In our case, we are using Maven.  So to simplify portlet versioning, we keep the version number in the POM file and use variables throughout the other files where necessary.  The variables are automatically replaced with the version number when the WAR file is generated.

In the POM file we set up the WAR filename to be ${pom.name}_${pom.version}.war, which minimizes the number of changes required when changing the name or version of the portlet or hook.  Note that Maven can't be used with ext plugins.
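
We didn't include our full POM in this post, but a minimal sketch of the relevant part might look like the following, assuming the standard maven-war-plugin and its web-resource filtering (the paths shown are illustrative):

    <build>
        <!-- Produces e.g. calendar_1_0_4_0.war -->
        <finalName>${pom.name}_${pom.version}</finalName>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <configuration>
                    <webResources>
                        <resource>
                            <!-- Filter the descriptors in WEB-INF so that
                                 ${pom.version} placeholders are replaced
                                 when the WAR is built -->
                            <directory>src/main/webapp/WEB-INF</directory>
                            <targetPath>WEB-INF</targetPath>
                            <filtering>true</filtering>
                        </resource>
                    </webResources>
                </configuration>
            </plugin>
        </plugins>
    </build>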

2013-08-05

Liferay - Multi-Stage Development and Data Management

Our development environments and data migration strategy have evolved during the project, and will continue to evolve as development continues.  Process and design improvements are encouraged and investigated in an attempt to optimize the end product.

When multiple developers are involved in a large project, multi-staging is extremely important.  It allows developers to work simultaneously and independently, without affecting each other.  It also allows stability and load testing, along with a number of other advantages.  The number of stages required depends on several things (this is obviously not a complete list):
  • Size of the project
  • Complexity
  • Number of developers
Two stages is the minimum required for any project: one for development and one for production.  However, it's highly recommended to have a Test environment between Development and Production.  The Test environment should be identical, or at least as close to the Production environment as possible, to minimize risk when deploying updates to Production.

In our situation we have three stages, which are:
  • Development
  • Test
  • Production
Furthermore, we are currently debating adding a fourth stage, QA, that would always be kept identical to the Production environment.

To meet our requirements, the Production environment is designed to be highly available with session replication.  Each stage incrementally becomes more similar to the Production environment.  The incremental changes spread out the issues that are due to environment variations, easing debugging.  Each environment is designed specifically for a purpose. 

Initial development is performed on a local desktop or laptop, which is not listed above.  The local environment is the most flexible, is easily restarted, and is best for independent development and debugging.  Each developer has a complete environment running locally on their desktop/laptop.  This environment is used for rapid development, initial integration and testing by the developer, debugging, etc.  It gives developers the maximum freedom to work without affecting each other, which is very important near the start of the project because restarting services is quite common.  Locally we're developing using Liferay Developer Studio on Microsoft Windows.

We're using Development for initial integration between developers and for testing.  It runs on RHEL.  The Test environment adds session replication, is located in a DMZ with public access, and uses an Oracle database.  These differences between the Development and Test environments increase reliability and performance, and they also make the Test environment extremely similar to the Production environment.

Data migration between environments quickly became important to synchronize configurations and reduce overall effort.  We use built-in features of Liferay to migrate documents, content and configurations.

To migrate web content as well as documents, we export and import LAR (Liferay ARchive) files.  They're easy to use; however, we sometimes notice issues with improper migration and have to repeat the export and/or import.  Issues we've encountered include missing content and missing permissions.

You can use staging to migrate pages and other configuration; however, we are using Liferay's database migration (Control Panel -> Server Administration -> Data Migration).  Data migration is a bit more work to use, but right now we aren't synchronizing individual pages -- which is the main advantage of staging -- we generally want to synchronize everything.

The headaches that go along with database migration are having to initialize the target database (shutting down the instance of Liferay that's using it, and deleting all tables within the database) and then restarting both Liferay instances.  It is much more powerful, though, allowing us to migrate from any environment to any other.  We find it extremely useful to migrate from Development or Test to our local environments during integration and testing.  It will also be important to migrate from Production to Test, Development or a local environment when debugging and trying to recreate problems that occur in Production.

To aid with data migration, we switched from the default Hypersonic database to MySQL in all the environments before Test.  (The Test and Production environments use an Oracle database.)  We moved to MySQL for several reasons:
  1. Allows database backups and quick restores because the database does sometimes become corrupt. We have nightly backups of our local and development databases.
  2. Allows database migration to and from any environment
  3. Allows direct access to the database for debugging and clearing lock records (Lock_ table)
Originally we hosted Liferay on GlassFish, but we ran into several issues, including problems with session replication and data migration.  We have since switched to Tomcat as our application server and haven't had any application server problems.

So for now, we have a pretty complete development strategy with respect to staging, backups and data migration. We did not arrive at this state right away. Several of the decisions were made during development in an attempt to streamline processes, reduce effort, improve efficiency and end up with a more easily maintainable end product; something that our team is constantly working to achieve.

2013-06-08

RSS - Replacing Google Reader with ifttt and email

I used to be a Google Reader user, and used it religiously.  I used the web interface, and had NewsRob on my phone.  It was great.  Wherever I was, I had access to all my news feeds, and never missed an article.  I had my feeds nicely organized by subject, and kept on top of them.  Checking my feeds was a regular part of my day: quickly skimming titles for articles I was interested in, and archiving items using other tools so that I could read them later when I had the time.

When I heard that Google Reader was getting axed, I wasn't sure what I'd change to.  I tried a couple of the alternatives that people posted (e.g. Lifehacker's Five Best Google Reader Alternatives), specifically Feedly and NetVibes, but they weren't quite what I was looking for.  NetVibes is good, but I had periodic synchronization issues (entire RSS feeds would become unread) and it only syncs 100 items at a time, so it isn't that useful offline.

I continued to use NetVibes for a month or so, and then it clicked: I was interacting with feeds a lot like email (and in fact Google used the same UI for both Google Reader and Gmail).  So I figured, why not just develop something that converts RSS feed items into emails, then use whatever email client I want to manage the articles?  Some of the advantages of managing feed items as emails are:
  • Email addresses are free
  • Email is sticking around and commonly used
  • There are numerous email clients
  • A lot of people are working on improving email clients and managing emails
Developing a daemon that read RSS feeds and generated emails based on them had been on my to-do list for a while.  I knew it wouldn't be too difficult to do, because I had developed daemons that generated emails before (I wrote a simple one in Python that checked the stock status of the Nexus 4 every 5 minutes and sent me an email when it changed).

The much easier solution, which I came to before wasting any development effort, was to use ifttt.  I already use it to forward specific G+ posts to Yammer, Twitter, etc., and it works great.

I created a recipe for each RSS feed I follow, and configured the recipes so that they're easy to manage within ifttt and the emails are easily manageable.  Here are the steps I followed to get set up:
  • Sign up for a gmail account
  • Sign up for an ifttt account
  • Create a recipe for each RSS feed
    • Add RSS tag to recipe description (#RSS)
    • Add RSS feed name to recipe description
    • Trigger: RSS feed (need the URL)
    • Action: Send an email, prefix the subject with an acronym for the RSS feed
The prefixes in the subject allow you to quickly look through items and determine which feed they came from.  You can also use the prefix to sort/filter incoming emails and redirect them to folders (most email clients) or add the appropriate labels (Gmail).

Archiving items for reading later or reference is very easy to do with emails as well, which is another nice benefit.

Some final notes:
  • You can use an existing email address.  If you do, I suggest adding something in the subject line of the email that allows you to create a custom filter to separate the feed items from normal emails.
  • Keep the prefixes used in the subject short so that they don't dominate the subject line when you're looking at emails in your email client (especially on a small screen like your phone's)