
Archive for the 'Tools' Category

Twitter has been around for quite a while, but it has really grown over the past two years. People tweet to share their ideas and opinions and to keep their friends (aka followers) up to date on current news, and lately almost every business has been using Twitter constantly to promote offers and other valuable information.

Competition in the online world is fierce, and Twitter is no exception. Brands have to become ever more creative to attract followers, and just as important is preserving the follower base they already have. One of the ways to achieve that is to let followers share their opinions with the “big brother” – the brand. That is what a Twitter Chat is about: allowing followers (aka customers) to ask questions and offer personal opinions. It is doubtful that many of these ideas are actually heard at company HQ (although some certainly might be), but the “round table” definitely provides a certain level of satisfaction for the clients.

Additionally, this emphasizes a very important thing in the internet era – there are actual people behind the brand. @username is not a tweeting robot; it is a living human being that can chat with you from time to time.

Google Webmaster Tools is a popular service, used by hundreds of thousands, maybe even millions, of website owners and web analysts around the world. Everybody works with the numbers it provides without questioning the reliability of the data. Ask a mechanical engineer, however, and he will tell you that everything in this world has a “tolerance” and no measured value is absolutely accurate.

The same mechanical engineer will tell you that a generally acceptable tolerance is about 1%, so a part specified at 10 inches can easily measure 10.1 inches or 9.99 inches – and will still pass the quality assurance test.

It seems Google is not yet ready to go into product design. Reportedly, the numbers offered by Google Webmaster Tools are only about 90% accurate. According to Google’s Asaph Zemach, asked about an impression count that stayed constant for three months: “…when you see 24,900,000 you should really think 25M +/- 2.5M…”

Thus, if you think your website is ahead of a competitor’s, with 50 million impressions to their 48 million, they might already be ahead of you: your actual figure might be as low as 45 million and theirs as high as 52.8 million… As usual, statistics can be deceiving, and Google’s +/-10% just makes them more so.
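To make the arithmetic concrete, here is a small Python sketch (our own illustration, not anything Google provides) that turns a reported impression count into the band implied by a +/-10% tolerance:

    def impression_band(reported, tolerance=0.10):
        """Return the (low, high) range implied by a +/- tolerance."""
        return reported * (1 - tolerance), reported * (1 + tolerance)

    # "Our" 50M vs. a competitor's 48M: the two bands overlap heavily,
    # so the apparent 2M lead proves nothing.
    print(impression_band(50_000_000))  # (45000000.0, 55000000.0)
    print(impression_band(48_000_000))  # (43200000.0, 52800000.0)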

We have been thinking about how to stop spam bots without using a CAPTCHA. Most CAPTCHAs work to a certain degree, but in general you do not want to make it any harder for real people to get through your web forms. Well, Bob nailed it: add a CSS class with display:none to an extra form field. Robots will fill out this field and real people will not.
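To illustrate, here is a minimal sketch of the honeypot idea using Flask (our choice of framework; the field name “website” and the route are hypothetical, as the post only describes the technique itself):

    from flask import Flask, request

    app = Flask(__name__)

    # The trap field is hidden with display:none, so humans never see it,
    # while form-filling bots happily populate every input they find.
    FORM = """
    <style>.hp { display: none; }</style>
    <form method="post" action="/contact">
      <input name="email" placeholder="Your email">
      <input name="website" class="hp" autocomplete="off">
      <button type="submit">Send</button>
    </form>
    """

    @app.route("/contact", methods=["GET", "POST"])
    def contact():
        if request.method == "POST":
            if request.form.get("website"):  # a human leaves this empty
                return "", 204               # silently drop the bot
            return "Thanks!"
        return FORM

The autocomplete="off" attribute is there so browsers do not helpfully fill the trap for real users.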

So kill those super annoying CAPTCHAs… personally, I cannot read half of them anyway.

Marketing has always been about customer psychology: learning the specific needs of a potential client and trying to satisfy them. Google has demonstrated once again that targeting a specific audience is mandatory. And if you are into SEO, you should be aware of it.

According to one recently published study, Google Search results on a smartphone vary by over 80% from those produced by the same query on a desktop computer. If you think about it, it is actually quite logical to use a slightly different search algorithm for smartphones. For example, smartphone users like downloading applications, so mobile Google Search presents many results that include the word “app” or “download”. Brand filters and store filters cannot be applied to mobile Google Search, and it is even more biased towards “local” domains, with Google Places results usually appearing higher in the vertical results list.

With the smartphone market growing quickly, it seems only a matter of time before “SEO for mobile” becomes a separate branch of the industry.

According to several anonymous sources, Google is planning to launch an e-newsstand application that will run on Android tablets and smartphones, in order to compete with Apple’s iTunes sales.

Leading publishers, such as Time Inc. and the Wall Street Journal, are being approached by executives of the not-only-biggest-search-engine company in order to establish a fruitful cooperation and figure out the most beneficial approach to application development. Commissions lower than Apple’s (30 percent) are being promised, as well as various advanced buyer data-gathering features.

When all this is going to happen is unclear, and some doubt that the venture will launch at all – but the initiative is too important to overlook as Google looks to enter another niche.

Our world, especially the online part of it, is all about speed. That’s why Google became so popular so quickly: the number of search results shown, and the speed at which those results were gathered and presented to the user, did the trick.

Nowadays, of course, many search engines are almost as quick as Google, with the difference in the speed of producing results measured in milliseconds. What users now want from their search engine is convenience and reliability. And while questions have been raised about Google’s supposedly “biased” results, the search giant’s UI had never been an issue. That is, until now. The revised Google Image Search has a problem that can slow down entering a query.

Normally, when you type a query into the search box and get the results, you do not need your mouse: you continue typing to narrow the search, or press Shift-Backspace to erase the previous entry and start over. So far, so good. However, in Google Image Search the query box seems to lose focus once the pointer is moved, deliberately or accidentally. So when you decide to change your search, you have to use the mouse (one of the speedy typist’s worst nightmares) to activate the query box again.

Google has reported that it is aware of the problem and the “frustration”, and is “looking for ways” to solve it.

When Google introduced its new tool, the Google Books Ngram Viewer, several days ago, many were enthusiastic about it being the ultimate feature for etymological research. After all, the Ngram Viewer lets you search millions of books (Google Books, of course) and then check, track, and analyze the appearances of any word across many centuries.

Users were enthusiastic at first, but it turns out the tool is far from perfect. According to a recent review, there are many problems and inaccuracies in Ngram Viewer reports, both expected and unexpected. A very basic issue is OCR (Optical Character Recognition). Even with modern books and fonts, occasional mistakes occur; the best OCR programs report just under a 1% error margin on recognized text. For books from the 16th and 17th centuries, with their artistic fonts, this margin is sure to be higher. One example is the letter “s” (printed in old books as the “long s”) being confused with “f” on numerous occasions.

Another problem concerns finding a word’s first occurrence: the Google Books Ngram Viewer does not account for how language develops over time, so you have to search several historical forms of the word to find its actual first usage. And then there are reprints: many Google Books are labeled with the year of their printing instead of the year of the original manuscript, making the search produce more hits for “recent” years.
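If you do want to chase a word through its historical spellings, the viewer accepts several comma-separated terms in a single query. Here is a hedged Python sketch that simply builds such a query URL (the spellings and parameter values are illustrative, and since there is no official API the URL format may change):

    from urllib.parse import urlencode

    forms = ["physick", "physicke", "physic"]  # historical spellings to compare
    params = {
        "content": ",".join(forms),  # comma-separated terms plot as separate lines
        "year_start": 1600,
        "year_end": 2000,
        "smoothing": 3,
    }
    print("https://books.google.com/ngrams/graph?" + urlencode(params))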

Overall, the Google Books Ngram Viewer is not bad. It is just not as reliable as one might think: suitable for casual queries, but not a dependable tool for serious academic research.

Bing has introduced a new feature in Bing Image Search. It is called “Instant Answer” and is essentially a small bar of “tabs” that appears just below the search box. Each tab suggests a more specific query, which is especially helpful with ambiguous search terms.

For example, when you type “heat” as the Bing Image Search term, “Instant Answer” will suggest narrowing your search to “Miami Heat”, “Heat Energy”, “Heat Wave”, etc. The term “star” will feature shortcut tabs for filtering images of “Patrick Star”, “Star Wars”, “Star of David” and more.

This feature resembles the familiar “related searches” suggestions in web search and saves time: instead of retyping the whole query, a single mouse click does it. “Instant Answer” is, however, currently only available on the US Bing website.

Twitter LOVES Twitter spam. Is there an official name for Twitter spam? Twam? (OK, I will trademark that.)

Why do I say this? Because I found it shockingly difficult today to find a tool to bulk-unfollow on Twitter. I have been in quite a dogfight today. In theory, what I am trying to do should be simple: I would like to bulk-remove the spammers and defunct Twitter members that a client is following thanks to one of those auto-follow programs.

This Twitter account is currently following 20K members and has about 20K people following it. I don’t want to close the account, as they would lose many loyal followers; my guess is about 5K. So we need to remove all the junk. We started by trying to remove them manually in the Twitter interface: 20 minutes removed about 200, and I quickly figured out this was not going to work (at that rate, clearing 20K would take over 30 hours). So I decided to look for tools online that could assist with removing most of the accounts currently being followed.

I found plenty of tools for removing Twitter followers, and of course tried the free ones first. Zero success with those; it’s possible these free tools are simply not being maintained. Time to spend some money on the right tool. The first paid tool could not get past the API call to Twitter after 30 minutes. I was fast becoming frustrated, but luckily I found a nice blog post listing 23 tools. Here’s that list for your reference:

http://www.aboutonlinetips.com/mass-bulk-follow-unfollow-tools/

And here is a summary of what I found checking these out:

http://www.socialoomph.com/ I did not try this one, as nowhere did it mention bulk-removing the Twitter members being followed.

  • Although on second look I noticed, about ¾ of the way down the page, that they do say: “If you find that you are following a large number of accounts that you don’t really want to follow, you can use our system to completely wipe out your friends list. When the process is done, you will be following nobody and will be able to hand-pick those accounts you want to follow.” This is immediately followed by “Due to the Twitter rules, this feature is not available on Twitter accounts.” So maybe they can offer the fresh start, maybe not.

http://www.jdwashere.com/twiping/index.html Twiping looked really promising; I quickly dropped $9 on it and downloaded the application. Except all it did was lock up my computer, multiple times, on multiple Twitter accounts. Really annoying.

http://www.theunfollowed.com/home.php Looks really easy: log into Twitter and unfollow in bulk. Perfect. But after authenticating with the Twitter account, it kicks back to http://www.theunfollowed.com/home.php, and when I click on “bulk”, nothing happens; it just locks up. Tried Firefox, Chrome and IE… same thing.

http://www.tweemaid.com/ Tweemaid is another tool that claims to wipe the list clean, but after three browsers, all I got was a dead page at the end.

http://joeldrapper.com/untwollow/ Untwollow only removes one followed account at a time (I think; I did not confirm the removal, as I did not have time for one-at-a-time)… yet another lost cause.

http://friendorfollow.com/ This tool seems to function, but does not do what I am looking for. It just shows you who you’re following that is not following you back; possibly useful, but it will not work for this task.

http://www.huitter.com/ apparently no longer exists.

http://dossy.org/twitter/karma/ also looked promising, but it is never a good sign when the page says that if it does not work within 5 minutes you should reload and try again… guess what, that was me. Moving on.

http://www.mycleenr.com/ apparently out of business, but they were kind enough to recommend two other services: FriendOrFollow, which I had already tried without success, and:

Refollow http://www.refollow.com/refollow/index.html The verdict is still out: I paid them $20 and am waiting 24 hours for a confirmation of payment? (Certainly not giving me the warm fuzzies; they may as well have sent something via the US Postal Service.)

http://www.buzzom.com/ Does not even explain on the website how to begin…

http://twitoria.com This looked good, except it only appears to list the users who have not been active on Twitter for some time; it does not appear to offer a chance to go nuclear on them.

http://tweetblocker.com/ This website appears to be either down or gone… I vote gone.

Needless to say, the list goes on; I tried probably 5-6 others whose names I cannot even remember.

I did find two that work!

http://untweeps.com was fairly simple to log into. It apparently only permits you to unfollow people who have not tweeted within a chosen time range, such as the last 30 days, although if you sign up for $2.00 you can remove just about everyone. If you are patient with the loading time, you can remove about 700-1000 at a time (secret: put 0 in the box for time since last tweet).

http://tweepi.com also works, but will only permit you to remove 20 at a time (or 40 at a time if you tweet about how wonderful they are).

For either of these two services to work, you need to install one of the following two Firefox check-all-boxes plug-ins:

https://addons.mozilla.org/en-US/firefox/addon/2393/ I tried this one with Firefox 3.6.13 and it would not work.

https://addons.mozilla.org/en-US/firefox/addon/9740/ This one specifically says it will not work with the latest browsers, so I downloaded Firefox 3.0 and the utility worked great.

With either plug-in, simply highlight the entire section you want to check, then right-click and select “check”. Once all the boxes are checked, click Remove.
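If neither add-on works with your browser, the same check-everything-then-click-Remove step can be scripted. Here is a hedged sketch using Selenium for Python (the selectors are hypothetical and will need adapting to whichever tool’s page you are on):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get("http://untweeps.com")  # log in and bring up the unfollow list
    input("Press Enter once the list of accounts is on screen...")

    # Tick every unchecked checkbox on the page.
    for box in driver.find_elements(By.CSS_SELECTOR, "input[type=checkbox]"):
        if not box.is_selected():
            box.click()

    # Submit; this selector is a guess and must match the actual page.
    driver.find_element(By.CSS_SELECTOR, "input[type=submit]").click()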

In finally finding these two websites that actually appear to work, I discovered something interesting: Twitter is making bulk unfollowing VERY difficult. Its API terms actually state that a “Check All” button is a violation of the TOS (terms of service). In effect, Twitter is saying it is so interested in keeping its numbers of members, followers and activity inflated that bulk unfollowing is becoming basically impossible.

I suppose that Twitter, in a half-hearted way, likes Twitter spammers, auto-bots and deadbeat users, because if it really wanted to eliminate these users, offering the ability to bulk-remove people would not be such a mighty task.

I could understand if Twitter took a very hard line on all API applications, but many of the auto-follow applications work flawlessly; it seems only the bulk removal of followed accounts is discouraged.
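For what it’s worth, the Twitter REST API itself does expose the unfollow call, so a rate-limit-respecting script has always been possible. Here is a hedged sketch using the tweepy library (3.x-era API); the credential strings are placeholders, and the keep set stands for whatever loyal accounts you want to preserve:

    import time
    import tweepy

    # Placeholder credentials from a registered Twitter application.
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
    api = tweepy.API(auth)

    keep = set()  # user ids of the ~5K loyal accounts to hold on to

    for uid in api.friends_ids():            # ids the account is following
        if uid not in keep:
            api.destroy_friendship(user_id=uid)  # unfollow
            time.sleep(2)                    # stay well under the rate limits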

I hope my 4 hours can save you some time if you ever find yourself needing a Twitter bulk-unfollow or removal tool.

Using page speed as a factor in the search-ranking process will only affect a tiny percentage of sites, according to information from Google. The company provides a number of ways to speed up sites, and webmasters should probably take a look at these: code.google.com

The key to mod_pagespeed is its filters, which apply performance best practices to your pages. The module includes filters to optimize JavaScript, HTML and CSS style sheets, along with filters to optimize JPEG and PNG images.

When word first came out about mod_pagespeed, there was a tendency to panic. Developers and webmasters learned that, in addition to the dozens of factors already affecting search-engine rankings, Google was going to start using speed as a ranking factor. The questions were:

  • Would this have a serious negative effect on the “little guys” who couldn’t afford to fully optimize their self-designed pages?
  • How would complex but attractive pages be affected?

Here’s the bottom line on mod_pagespeed: webmasters and developers will use it to improve the performance of web pages, but there is some specific information they need to know. It is open-source Apache software that automatically optimizes pages and content served by the Apache HTTP server.

Using mod_pagespeed in combination with the correct compression and caching steps should result in significant improvement in loading time.
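Once the module is installed, it is worth checking that it is actually rewriting your pages: by default, mod_pagespeed adds an X-Mod-Pagespeed header to responses it has processed. A small standard-library Python sketch (example.com stands in for your own server):

    import time
    import urllib.request

    start = time.time()
    with urllib.request.urlopen("http://example.com/") as resp:
        body = resp.read()
        version = resp.headers.get("X-Mod-Pagespeed")

    print(f"fetched {len(body)} bytes in {time.time() - start:.2f}s")
    print("mod_pagespeed:", version or "not detected")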

Basic Steps

Before committing to the use of mod_pagespeed, be sure you are working with Apache 2.2, since there are no plans to support earlier versions. If you’re up to it, you can develop a patch for those earlier versions yourself.

According to the best information available, mod_pagespeed can be downloaded as binary packages for i386 and x86-64 systems, with the source available through svn. Google’s instructions add: “It is tested with two flavors of Linux: CentOS and Ubuntu. The developer may try to use them with other Debian-based and RPM-based Linux distributions.”

Several filters have already been mentioned, but Google puts special emphasis on “exciting experimental features such as CSS outlining.” Despite the name, this has nothing to do with drawing outlines around page elements: the outline_css filter moves large inline blocks of CSS out of the HTML into separate, cacheable files.

Achieving optimum page speed is the goal, but webmasters also have to take compression, caching and download order into consideration. It’s equally important to cut down the number of round trips the browser makes to the server for each page.

As you learn more about this change in how search engines and the Web function, you may also want to look at Page Speed, an open-source add-on for Firefox/Firebug used to evaluate page performance and find ways to improve results.

It is an extension to Mozilla Firefox that runs within the Firebug web-development package. Running a performance analysis on a page gives the user a set of rules and suggestions to follow for improvement. Page Speed measures page-load time, so it presents the issue from the user’s viewpoint.

Rich Get Richer?

So, with this additional information about page speed as a ranking factor, should we all start to worry about the “big boys” crushing the smaller sites? As mentioned briefly earlier, this was a cause for concern among webmasters, developers and site owners. But a blog post from Matt Cutts in April went a long way toward reducing the stress of this announcement.

He simply thinks this is not that big a deal. His reasons include:

  • Site relevance will still be paramount
  • Reputation and content quality will still be primary factors
  • Less than 1 percent of Web queries will change after the inclusion of site speed as a factor
  • Most people didn’t even notice when Google launched the “speed” factor

There are many sources for learning about site speed, mod_pagespeed and the other factors in this process; Google has devoted a section of its web presence to the issue (see code.google.com). As for the problems this presents to small sites, Cutts and others believe that smaller sites can actually react more quickly and should feel fewer negative effects.

Keep in mind that when we took a brief look at Page Speed (the open-source tool) above, we noted that its emphasis is on load time: the amount of time between a user’s request and the moment the full page, with all graphics, images and text, is visible.

Some larger sites with complex designs may suffer if they can’t figure out a way to speed up their load time. Part of the answer may be in the move to larger hosting companies that can afford to put lightning-quick servers into operation.

But there’s still another issue with using page speed. Quality web companies tend to analyze every detail of their operation, and Google Analytics itself can slow down load time as it gathers data. How will that be folded into the overall operation of a site?

To Sum Up

Slow page loading leads to loss of visitors; studies have shown this to be true. Even if a site is very popular, users will drift away because of slow response times, and it may take a week or two for them to build up the courage to come back. The best basic steps are reducing download size where possible, improving layout and minimizing round trips between pages.

We don’t need to panic about mod_pagespeed but then we probably shouldn’t be shouting from the rooftops either. The best path is somewhere between these two extremes.