Earlier this year Google began including AMP listings in its mobile search results. AMP, short for Accelerated Mobile Pages, is a specification for creating slimmed-down pages for mobile devices. It’s essentially a subset of standard HTML with added restrictions. In addition, Google caches AMP pages on its own CDN to provide the fastest retrieval possible; any AMP pages appearing in Google Search results link to these cached copies.
You may be asking yourself why we need such a thing when mobile phones are already perfectly capable of displaying “ordinary” websites. It’s a fair question, and admittedly it was the first one I asked when I learned of AMP. The AMP project says its purpose is to give mobile users a better, faster web experience. But what’s wrong with the current experience on mobile phones? Is it really bad enough to warrant an entirely new page specification when we already have HTML5?
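Whatever the motivation, the shape of an AMP document is easy to see in a minimal sketch. The skeleton below illustrates the required structure; the canonical URL is a placeholder, and the lengthy mandatory `amp-boilerplate` style block is elided here, so copy it verbatim from the AMP project documentation:

```html
<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <!-- every AMP page must point at its regular HTML counterpart -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- the required amp-boilerplate <style> and <noscript> blocks go here;
       copy them exactly from the AMP project docs -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- the restrictions at work: no custom <script> tags, and <img> is
       replaced by the <amp-img> component with explicit dimensions -->
  <amp-img src="photo.jpg" width="400" height="300"></amp-img>
</body>
</html>
```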
phpMyAdmin is a handy tool for administering MySQL from an easy-to-use web interface. However, leaving such a powerful tool open to the entire world can be downright dangerous and should be avoided if possible. Ultimately it’s best to keep it off your production server (or any other server you care about). If you absolutely must use phpMyAdmin, though, you should restrict who can access it. Below is a quick and easy tweak that allows access only from a specific IP address. It assumes an Ubuntu LAMP stack, but should work fine on any Linux distribution, although paths may differ.
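As a sketch of the kind of tweak I mean (the sample address is a placeholder; on Ubuntu the phpMyAdmin Apache fragment usually lives at /etc/phpmyadmin/apache.conf), an Apache 2.2-style allow list looks like this:

```apache
# /etc/phpmyadmin/apache.conf (path may differ on other distributions)
<Directory /usr/share/phpmyadmin>
    Order Deny,Allow
    Deny from all
    # replace with the IP address you administer from
    Allow from 203.0.113.10
</Directory>
```

Reload Apache afterwards (`sudo service apache2 reload`) and every other address will get a 403. On Apache 2.4 and later, the equivalent directive is `Require ip 203.0.113.10`.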
Recently, I needed to have a few specific categories show their products in grid mode, while all the other categories show products in list mode. This should be straightforward, since the products that appear on a Magento category page can appear either in a vertical list or in a grid, and, depending on how Magento is configured, the user can choose which view to display the products in.
AWS makes it easy to take snapshots of your EBS volumes. However, if you have many volumes, a way to automate and rotate snapshots becomes essential. There are many solutions out there for automated snapshots; one excellent option is the ec2-automate-backup script. Run it from a cron job and you can snapshot all your volumes in a specific region. Below is how I’ve set this up.
I have multiple EBS volumes attached to multiple EC2 instances. I needed a way to take a daily snapshot of all volumes. In addition, I needed the snapshots to rotate, such that only the last 7 days worth of snapshots would be kept.
I chose to create a small EC2 instance specifically for running ec2-automate-backup from a cron job that backs up the volumes of all my production instances.
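As a sketch, the crontab entry on that instance looks something like the following. The install path, log file, and tag value are my own assumptions, and the flags (-s/-t to select volumes by tag, -k to keep seven days, -p to purge expired snapshots, -r to set the region) should be checked against the script’s README:

```shell
# hypothetical /etc/cron.d/ebs-snapshots: snapshot tagged volumes nightly
# at 02:00 and purge snapshots older than seven days
0 2 * * * backup /opt/aws-missing-tools/ec2-automate-backup/ec2-automate-backup.sh \
    -r us-east-1 -s tag -t "Backup,Values=true" -k 7 -p >> /var/log/ebs-snapshots.log 2>&1
```

Tag each production volume with `Backup=true` and new volumes are picked up automatically, with no change to the cron job.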
WordPress can make for a great CMS, but it’s not always feasible or practical to use it for your entire site. One common scenario is using WordPress for a blog that runs alongside a site built with a completely different technology. A problem arises in this situation, however: what if you want to display blog posts on your non-WordPress site?
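One low-friction option, whatever the rest of the site is built with, is to consume the blog’s RSS feed, which WordPress typically publishes at /feed/. Below is a sketch in Python; the feed URL is a placeholder and the helper names are my own:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

def parse_posts(rss_xml, limit=5):
    """Return (title, link) pairs from a WordPress RSS 2.0 feed document."""
    channel = ET.fromstring(rss_xml).find("channel")
    items = channel.findall("item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

def recent_posts(feed_url="https://example.com/blog/feed/", limit=5):
    """Fetch the live feed and return the latest posts."""
    with urlopen(feed_url) as response:
        return parse_posts(response.read(), limit)
```

Since fetching the feed on every page view is wasteful, you would normally cache the result for a few minutes on the non-WordPress side.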
The present trend is toward mobility, and that applies to computing as well: more and more people are opting for internet-enabled smartphones because they help them stay connected even while on the move. This poses a problem for websites and their developers, since many sites are not mobile-friendly and are effectively inaccessible from a phone. A website that can’t reach customers on mobile means lost customers, which no business can afford. Website owners and developers therefore have to make their sites available on mobile as well.
Fortunately, building a mobile version of a website is not very difficult, as there are several tools that ease the process. Some of these tools are discussed below:
On the Ecommerce Outtakes blog, we talk a lot about what not to do online. In fact, our main focus is to point out where websites go wrong—with the intent, of course, to help improve the e-commerce experience across the web. One trend we’ve been noticing a lot lately is a lack of good filtering and sorting options. It’s a widespread e-commerce epidemic, and it’s high time we cured it.
Page loading time is crucial to keeping visitors on your site and
maximizing conversions. Studies suggest the maximum time people are
willing to wait for a page to load is less than five seconds. Make them
wait longer than that, and it’s game over. They’ll hit the back button,
never to return. It’s vitally important, then, to make sure your website
loads as fast as possible.
Sure, having a super-beefy server helps, but one important aspect of
a fast-loading website is reducing the size and number of your page
assets as much as possible. This can be a real challenge in the current
state of the Web. Web pages are becoming increasingly complex globs of
code that require a huge number of assets to display and function
properly. In addition to the plain old HTML, a ton of CSS, JavaScript,
and images must be downloaded in the background for the page to fully
render in a browser.
“What’s the big deal?” I hear you ask. “All my visitors are on
broadband, and the js/css/images are only a few KB extra – hardly a drop
in the bucket!” Now, this may very well be true. However, the actual
size of your files is only a small part of the overall cost incurred on
a page load. There is a much more subtle bottleneck that has nothing to
do with file size: the maximum concurrent connection limit.
This is a limit the browser enforces which dictates how many
connections can be open simultaneously to a single server. Even if
you’re on a super-fast connection, your browser will still limit the
maximum number of files you can download at one time. This number varies
from browser to browser, and may change slightly depending on connection
speed and web server configuration. Modern versions of Internet
Explorer, Firefox, and Chrome all allow roughly six concurrent
connections per host name.
It can be helpful to think of the concurrent connection limit as the end
of a funnel that your page assets pour through. Naturally, the more
assets you have the longer it takes for them to get through the funnel.
Making your assets smaller helps them pour through faster, but still,
only so many can go through at once no matter how small they are. The
key is to combine them into as few files as possible, thereby reducing
the connection limit bottleneck. Making your files smaller AND combining
them is a win-win situation. Smaller files + fewer connections =
faster loading site.
To “minify” a file simply means to strip out all the “human readable”
parts of the file, such as indentation, line breaks, comments,
extraneous whitespace, and long variable names; that is, all the stuff
that makes a file easy for a human to read, but which a computer
couldn’t care less about.
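To illustrate the idea, here is a deliberately naive sketch of combining and minifying stylesheets. Real sites should use a battle-tested tool (the regex approach below will mangle CSS that contains comment-like strings), but it shows what “stripping the human-readable parts” means in practice:

```python
import re

def minify_css(css):
    """Strip the human-readable parts of a stylesheet: comments,
    line breaks, indentation, and extra whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove comments
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # trim around punctuation
    return css.strip()

def combine(paths):
    """Concatenate several stylesheets into one minified blob, so the
    browser makes a single request instead of many."""
    return "".join(minify_css(open(p).read()) for p in paths)
```

Serving the output of `combine()` as a single file attacks both halves of the equation at once: smaller files and fewer connections.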
“Same Thing” Sickness
One thing SEOs hate about web developers has to do with the way they carry out, or fail to carry out, a very specific request.
A case in point: An SEO asks a developer to create a 301 redirect between pages. The developer instead does a meta refresh or a 302 redirect, citing that it’s the “same thing.”
From the developer’s perspective, it’s the same thing for the user, but from an SEO standpoint, it affects the search engine rankings.
The Death of Optimization
Developer skills and SEO techniques go hand in hand, so if a developer fails to do their job, it doesn’t matter what the SEO team does. Even with a copy of Google’s secret algorithm in hand, the site won’t rank if the site doesn’t work.
A case in point: A client implements some redesign elements. Suddenly, traffic drops by 30%.
The problem: Many of the pages don’t load as they should, and the ones that do return 500 server errors. The developer failed to spot the errors during the development process.
The result: 3 weeks of seriously diminished traffic.
The “I Know SEO” Syndrome
This is a contagious disease among developers that can quickly spread to others. If you have ever heard a developer say something like “I’m pretty good at SEO,” it can usually be translated as “I’ve read a little about SEO and therefore I pretty much know more than you do.”
But wait a minute, SEOs. You aren’t immune, either. There is a related syndrome called “I can code.”
A case in point: An SEO expert successfully builds a WordPress site and suddenly deems themselves a web developer.
The Real Problem
At the root of the culture clash between coders and SEOs are their driving philosophies. Business classes that teach search engine optimization focus on uniqueness; after all, differentiating yourself from the competition is a good thing. Computer science classes, on the other hand, center on making everything the same. Each discipline takes a different approach to reaching the same result: stability and efficiency.
Like any other art form, web design is completely subjective. A website might look like a thing of beauty to one person and a complete mess to another. There is, after all, no accounting for taste, and everyone’s tastes are different. However, there’s more to a website’s design than merely its appearance. Design can have an enormous impact on conversions, and even the most subtle decisions can have a big effect. For example, a user might be more inclined to click a green “Buy Now!” button than a red one. Finding a good balance between a site that looks good and a site that performs well in terms of conversions can be a real challenge.
How, then, can something as subjective as web design be analyzed objectively to find the most effective design? One widely used technique is A/B testing. In a nutshell, A/B testing sets up two or more control groups: Group A sees one version of the site, while Group B sees another. This way, various design elements can be tested and compared.
But is A/B testing really the best way to determine the most effective web design? Perhaps not. This excellent blog post by Steve Hanov suggests another method for finding the best design. Best of all, it’s fully automated. Set it, forget it, and the page will “learn” which elements result in the most conversions.
In his post, Steve outlines the epsilon-greedy algorithm, an approach to what is known as the multi-armed bandit problem. Given a set of variations for a particular page element, the algorithm can make an ‘educated’ decision about which variation to show based on its past performance. The best-performing variations are displayed the most frequently.
The algorithm records the number of times a particular page element was displayed and the number of times it resulted in a conversion. It will also adapt to change: if an element’s conversions begin to decrease, the algorithm will start displaying other variations instead. The best part is that you set up the variations once and let the computer do the work of figuring out which ones are the most successful. Pretty neat stuff!
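To make the idea concrete, here is a language-agnostic sketch of the algorithm in Python (this is not Robo_AB_Tester itself; the class and method names are my own). Ten percent of the time it explores a random variation; the rest of the time it exploits the best performer so far:

```python
import random

class EpsilonGreedy:
    """Minimal epsilon-greedy chooser for page-element variations."""

    def __init__(self, variations, epsilon=0.1):
        self.epsilon = epsilon                          # exploration rate
        self.shows = {v: 0 for v in variations}         # times displayed
        self.conversions = {v: 0 for v in variations}   # times converted

    def rate(self, v):
        """Observed conversion rate of a variation (0.0 if never shown)."""
        return self.conversions[v] / self.shows[v] if self.shows[v] else 0.0

    def choose(self):
        """Pick the variation to display and record the impression."""
        if random.random() < self.epsilon:
            v = random.choice(list(self.shows))   # explore: random pick
        else:
            v = max(self.shows, key=self.rate)    # exploit: current best
        self.shows[v] += 1
        return v

    def convert(self, v):
        """Call when the displayed variation led to a conversion."""
        self.conversions[v] += 1
```

Because every variation still gets shown occasionally, a once-poor performer whose conversions pick up can climb back to the top, which is exactly the adaptive behavior described above.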
Armed with this knowledge, I set out to try a few experiments, the result of which is Robo_AB_Tester, a small PHP class library I created that implements the epsilon-greedy algorithm. You can give it a try here.
Robo_AB_Tester tries to abstract away as many implementation details as possible and create a simple interface that is, hopefully, easy to integrate into a PHP based website. Once it is set up, it will:
For more details, see the demo page.