Go green PCs

Personal computers have changed people's lives; these days nearly every family has a PC at home, and in my opinion everyone needs one. But because computers are now used on such a huge scale, the energy they consume is growing day by day, and there must be a solution to this problem. If computers can be made to run on less power, we can save a great deal of energy and bring about a real change in the world.

After some research, I found that low-power PCs have already been introduced around the world and have been a great success: they are said to cut energy costs by about 60% compared with current consumption. A low-power PC could therefore be a real revolution for people and countries alike.

It's not just about power consumption; they are also among the cheapest machines on the market. If people are really conscious about energy and money, then I suggest they should certainly get one. I have ordered one for myself, because every good thing should start with oneself.

Save energy, Save world – Go green.

Optimising on Facebook with the Open Graph Protocol

Social media optimisation, particularly on Facebook, is not something you should take lightly. We all know the power of word-of-mouth; well, these days that means the power of sharing on social sites via wall posts and the Like button.

Facebook users can easily share any content they like on their wall by uploading a photo or video, or just pasting a link. However, lots of people don't do this even when they visit a page they like, simply because it doesn't cross their mind to do so, or because they don't want to take the extra step of copying the URL and pasting it on their wall.

A good way to encourage people to share your page on Facebook is simply to add the Like button to your page and configure it using Open Graph Protocol tags, so that it shows more information once posted on Facebook. Sure, a simple link in someone's post is better than nothing, but since the goal is to entice as many users as possible to click on that link, you have to optimise the link to do just that.

The great thing about Open Graph tags is that they allow you to tell Facebook how you want your content's link to be structured when someone shares it on their wall. You can specify the exact details you want to appear, from the title to an image and description.
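As a rough sketch of what that looks like (the URLs and values below are placeholders, not the tags from any real page), the Open Graph tags sit in your page's head:

```html
<!-- Placeholder values; replace with your own page's details -->
<head prefix="og: http://ogp.me/ns#">
  <meta property="og:title" content="Bailey's Fundraising Page" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="http://www.example.com/fundraising" />
  <meta property="og:image" content="http://www.example.com/images/bailey.jpg" />
  <meta property="og:description" content="A short blurb that appears under the link when shared." />
</head>
```

og:title, og:type, og:url, and og:image are the four required properties; og:description is optional but fills in the snippet text shown under the shared link.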

As you can see below, the link I shared to Bailey's fundraising page on Just Giving is a lot more enticing to click on than it would have been as a simple, non-descriptive link. In the portion labelled number one, you can see how the Open Graph tags pull in important details, including the page title, a short description, and an image on the left.

The payoff for optimising on Facebook is also evident: fellow users who see the link have four possible actions. They can click on the link itself, Like the post (number 2 in the image), re-share it on their own wall (number 3 in the image), or comment on the post.

Click on the link to learn more about the Open Graph protocol tags and how to use them to turn your pages into graph objects.


How to mention multiple XML sitemaps in robots.txt?

A robots.txt file helps to inform search engine bots whether a file or a directory is allowed or disallowed for search indexing.

A typical robots.txt file will look like this:

User-agent: *
Disallow: /includes/
Disallow: /scripts/
User-agent: * - means this section applies to all robots
Disallow: - means the robot should not visit any pages in the listed folder

If you have one or more XML sitemaps, don't miss the opportunity to add them to the robots.txt file. Below is an example of how multiple XML sitemaps can be added to robots.txt:

User-agent: *
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
Disallow: /includes/
Disallow: /scripts/

Related Post: Everything about Meta Robots and robots.txt

Expecting Panda Roll Outs

Real-life panda cubs might be as cute as hell, but hell is also the right word to describe what most webmasters feel each time a new Panda of the Google kind is about to be born. After all, if you have cleaned up your site as best you can, as Will Spencer claims to have done (check Aaron Wall's post "Panda 2.5…and YouTube Wins Again" to see the conversation between Matt Cutts and Will Spencer), and can still get penalised for spammy content that was once part of your site, it is no wonder you get butterflies in your stomach just thinking about Google Panda updates.

While anticipation can make you even more nervous, shoving the issue under the rug won't do you any good come Panda update roll-outs. The best thing to do is still to clean up your site, strictly adhere to publisher guidelines, and stay in the know as much as possible. At the very least, knowing when Google's Panda updates are about to roll out will give you time to recheck your site for malware attacks (you never know!) and make last-minute adjustments.

One way you can easily do this is by following Matt Cutts on Twitter, who, by the way, tweeted last week that more Panda updates are due in the coming weeks (which means starting this week!). So don't be surprised if strange things start happening to your rankings, although he says it should have a smaller impact than previous updates, at less than 2%.

Panda Update Weather Alert from Matt Cutts – Hope you don't get wet!


How to Create a Custom 404 Page


If you’ve spent any time on the internet, chances are you’ve seen silly, funny, or just plain insane custom 404 error pages. However, this article isn’t about creating one of those types of custom 404 pages; this is about creating a better 404 page that is not only helpful to your users but solves technical problems you might not even know you had.

Make Your 404 Page Informative

Technically, a 404 is an "invisible" code sent by your web server to the computer or search engine that requests your page, telling it that the page it was looking for isn't there. There are lots of reasons why that could happen: maybe you moved the page, or maybe the person made a mistake when linking. We'll deal with those issues later on, but for now let's focus on the user. Some people will know what a "404 error" is but others won't, so my recommendation is to tell them it's a 404 error, but also include some information explaining what happened and that the page they were trying to reach isn't there.

That's where a lot of sites stop, and in my opinion they drop the ball. If you leave a customer at a dead end, they will leave; if you give them an information scent to put them back on track, you have a better chance of keeping them on your website and doing what you want them to do. Every website is different, but here are some suggestions for the kind of links/information to put on your 404 page:

  • Links to popular pages
  • Links to popular products
  • Links to important company, customer service, or contact information
  • Links to important FAQ’s
  • Links for technical support or contact
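None of this helps unless the server actually serves your custom page for 404s. The post doesn't cover that wiring, but on Apache, for example, it's typically a one-line directive in the site config or .htaccess (the path here is a placeholder):

```apache
ErrorDocument 404 /errors/not-found.php
```

Make sure the custom page still returns a 404 status code rather than a 200, or search engines will treat the error page as a real page.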

Getting Data from Your 404 Page

Now that you've made your 404 page useful for your users, how about making it useful for you and your IT team? One of the easiest ways to do this is to gather information every time the 404 page is visited and write it to a log or database. If your website is in bad shape and has a lot of errors, this could actually backfire and become a huge resource hog, so be careful when you implement this solution. If possible, try to capture what page/file was being looked for, where the visitor came from (the referrer), the IP address, and the user agent. Once you have had this in place for a while, you'll want to start aggregating the data and looking for trends or patterns.

  • Are there files/pages that are being requested often?
  • Are there websites that might have a bad link to you that you can ask them to fix?
  • Are you getting lots of 404s from one IP? (This usually indicates scraping or bot behavior.)
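A minimal sketch of what the logging step might look like, assuming a JSON-lines log file (the format, filename, and field names here are my own choices; the post doesn't prescribe any):

```python
import json
import time

def log_404(path, referrer, ip, user_agent, log_file="404_log.jsonl"):
    """Append one JSON line per 404 hit so the data can be aggregated later."""
    entry = {
        "ts": time.strftime("%Y-%m-%dT%H:%M:%S"),  # when it happened
        "path": path,                              # the page/file they were looking for
        "referrer": referrer,                      # where they came from
        "ip": ip,
        "user_agent": user_agent,
    }
    with open(log_file, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Aggregating is then just a matter of reading the file back and counting by path, referrer, or IP.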

Making Your 404 Page Work Smarter

Any website that has been around for a few years has likely seen some architecture changes, had pages come and go, and maybe even changed serving technology from HTML to JSP to ASP to PHP. The more changes, the more likely you are to have 404 errors. You can play with your server technology to have JSP pages really run as PHP, but that can get tricky. You can use server tools like ISAPI or HTACCESS to handle these redirections but, if there are hundreds of them, there will likely be a negative performance impact.

A better solution is to use the 404 page to handle redirections. When the 404 page gets called, before a single line of code is served back, check the requested file against a redirect table. If there is a match, then issue a 301 redirect; if not, then serve the normal 404 page. It's really important not to serve code back and then try to redirect: that creates what's called a "soft 404" and actually makes things worse. There are a few advantages to using your 404 page to handle redirects:

  • You can prevent chaining multiple 301 redirects together
  • Usually this method uses fewer resources than ISAPI or HTACCESS
  • You have one central point of maintenance
  • It’s a lot easier to migrate this solution if you change hosting companies or serving technologies
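The check-then-redirect logic can be sketched like this (the table contents and function name are hypothetical; in practice the table might live in a database or config file so you can maintain it without touching code):

```python
# Hypothetical redirect table: old path -> new path.
REDIRECTS = {
    "/old-products.html": "/products/",
    "/contact.asp": "/contact/",
}

def handle_missing_page(requested_path):
    """Decide what to do before a single byte of the 404 page is served.

    Returns (status, location): a 301 with the new URL on a match,
    otherwise a plain 404 (render the normal custom 404 page then).
    """
    target = REDIRECTS.get(requested_path)
    if target is not None:
        return 301, target   # permanent redirect, issued before any body is sent
    return 404, None         # fall through to the custom 404 page
```

Because the lookup happens before any output is sent, there's no soft 404: the client gets either a clean 301 or a real 404 status.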

So what are the takeaways from this post:

  • Funny or entertaining 404 pages are good, but 404 pages that help the end user are better
  • Use your 404 page to give customers the information they need to get back on track to complete a sale, fill out a form, or achieve other mission-critical goals
  • Be helpful with customer service, FAQ, technical, or support information
  • Use your 404 page to gather information about problems on your website
  • Use this information to fix broken links and set up redirects

photo credit: Shutterstock/3445128471

tla starter kit

Related posts:

  1. How to Create an Amazing About Us Page About Us pages are one of the essential elements of...
  2. Thesis Tutorial – Custom Post Dates The following post is part of my Thesis tutorial series....
  3. WordPress SEO: How to Create Living URL’s If you watched any of the congressional hearings about the...
  4. Optimizing WordPress Page Titles, Post Titles and Page Slugs While wordpress is one of my favorite CMS platforms, out...
  5. Create Landing Pages for Expired Information I’ve discussed what to do when you have products that...

Advertisers:

  1. Text Link Ads - New customers can get $100 in free text links.
  2. BOTW.org - Get a premier listing in the internet's oldest directory.
  3. Ezilon.com Regional Directory - Check to see if your website is listed!
  4. Need an SEO Audit for your website, look at my SEO Consulting Services
  5. Link Building- Backlink Build offers customized link building services
  6. Directory Journal - Get permanent deep links in a search engine friendly directory
  7. LinkWheel SEO - Get Web 2.0 Backlinks
  8. RevSEO High PR BackLinks- Private High PageRank Homepage Link Network
  9. The #1 ranking SEO software toolkit: get your free download
  10. TigerTech - Great Web Hosting service at a great price.
  11. Article-Writing-services.org - Article Writing Services creates quality content for websites and blogs at no cost to site owners.
  12. Rouper.com - Buy & Sell Premium Websites
  13. Blurbpoint - is reputable SEO link building service provider. Get quality one way links with our complete link building packages.

This post originally came from Michael Gray who is an SEO Consultant. Be sure not to miss the Thesis Wordpress Theme review.


Managing Your Sites at a Glance with Site Health

Managing websites can be really time-consuming no matter how efficient you are. Thankfully, Google has given us yet another way to make our lives just a tad bit easier by introducing “Site Health” on Webmaster Tools.

Site Health allows you to see all the websites you manage and know at a glance which ones have problems that need attending to. This obviously makes your life easier because you no longer have to look needlessly at the details of each of your sites; you can tend to the ones with problems right away. You can easily spot problematic sites since they are automatically given the "place of honour" at the top of your Webmaster Tools home page, and they show a round red icon with an exclamation point in the middle right beside the link to the site. A click on the icon will show you the list of site health checks and the status and details of each check.

Note, though, that if you have more than a hundred sites registered in your Webmasters account you won't see this new feature on your homepage. So I guess you poor chaps who have too much work on your hands will still have just as much work to contend with. As for the majority of webmasters, our lives just got a little easier.


SerpIQ Interview


The following is a sponsored post for SerpIQ.

For today's post we're going to be talking to Pasha Stewart and Darrin Demchuk of SerpIQ. Hi Pasha and Darrin, for my readers who don't know you, can you tell us a little about yourselves and your experience in the world of internet marketing?

Pasha: Thanks Michael. I've been doing internet marketing since 2004 and SEO since 2005. I started out getting a crash course in buying Google clicks from a friend and sold a ton of products, but when I did the books, I realized that the $10,000 I had just done that month in gross revenue didn't look so good with a $6,000 net and $4,000 ad spend. I decided I was going to learn how the guys on the left-hand (organic) side of the front page did it. That started me down the long, strange journey that is SEO. I figured some things out along the way, got a reputation as the guy to hire for "tough to rank" terms, and now I split my SEO time between testing new link targets/structures, maintaining the rankings (and conversion testing, which never ends) on my sites, and fitting in some high-end client work when time allows.

Darrin: Thanks for having us, Michael. I've been coding professionally since I graduated high school and have spent the last 5 years doing client development work. I got involved with SEO (and specifically automation and SEO software development) about 3 years ago and was taught by a great group of Internet Marketers, so I learned a lot of the lesser-known and less-discussed techniques for building and ranking sites. Pasha and I met through our mutual participation in a popular Internet Marketing forum and started working together privately on internal tools. Our first publicly available collaborative project was a link building service powered by a control panel I custom-built, which has done very well for us for about 18 months now.

Let's start out talking about client-side SEO. A lot of companies are starting to develop in-house SEO assets; do you think client SEO is going to grow or contract in the next few years?

Darrin: I think we're really at the beginning of a big growth period for in-house SEO assets as both reputation management and traditional SEO become more and more prevalent. This growth of in-house SEO assets is actually a good thing for SEO consultants, as they'll find clients both in those who aren't able to hire their own SEOs and in those who are in over their heads regarding their rankings and organic traffic. Just like Interactive Agencies have both in-house coding talent and outsourced consultants, we're going to start seeing the same arrangement for SEOs and consultants.

Pasha: I think client-side SEO will grow. I agree that the number of companies moving to an in-house SEO plan will also grow, but many companies will realize that their core competency is not online marketing; it's whatever brings in cash flow. As a former brick-and-mortar owner, I'm all too familiar with how many hats the small-to-medium business owner has to wear to get through the day; adding in marketing, especially marketing they have to learn on the fly, just isn't a possibility in most situations. With the advent of the mobile internet, it will only become more important for companies to establish solid online brands and positioning.

I know you worked in client-side SEO for a number of years; what do you think are some of the most important aspects of those types of business relationships?

Pasha: I think the most important element of client/SEO relationships is that the SEO has to provide a positive ROI on the client’s investment. The ROI should be of a monetary sort, but in some cases, specifically consulting, the ROI could be measured in client dollars saved down the road via their increased understanding and knowledge of upper level SEO concepts and implementation planning. I meet a lot of aspiring SEOs who ask me how to make money doing client SEO and my answer is always short and simple: Provide more value than you charge. And as you get better at what you do, charge more because the amount of value you provide increases exponentially.

Darrin: When I started working on more SEO projects in the agency setting a few years back, I quickly discovered the importance of managing your client’s expectations and making sure you establish your ROI metric points before embarking on a campaign. A client who is trained to only watch their rankings is going to be a major headache for you because every time they change (and with Google, we’re seeing this more and more frequently) you’re going to get a panicked/angry/upset phone call telling you to fix everything or else you’re fired. I’ve found the most success with teaching my clients what we’re really trying to accomplish, whether that be encouraging more call-ins from a trackable phone number, more form submissions, more sales, etc. If you can get your clients to focus on the aggregate business growth rather than the individual ranking fluctuations, you’ll have much more success when consulting.

Everyone involved in SEO directly or from a consulting perspective uses tools, what are some of your favorites?

Darrin: I’m very much a fan of high power scraping and bulk analysis tools, so I’ve used Scrapebox and Xenu Link Sleuth quite a bit to find good commenting targets on large sites. I use SEScout.com to automatically track my own rankings because I can’t be bothered to either build my own rank tracker or remember to check them frequently enough. I used Market Samurai for quite a while but it just got to be so slow and really slowed down my ability to just get stuff done. In my darker days, I’ve used quite a few of the automated link building tools, as well as a large amount of my own bots.

Pasha: Great question. It's safe to say I've either trialed or bought every SEO tool available over the last 5 years. Some of my favorite tools for processing data are scrapebox (+ scalc + textpad), Best Seo Suite, hrefer, and of course the Google Adwords keyword tool. For tracking rankings, the aptly named Rank Tracker from SEO Powersuite (editor's note: see Rank Tracker Review) seems as reliable as any while being relatively quick. I actually feel like the industry could really use a high-quality rank tracking tool, but that's another story completely.

You're involved with SerpIQ; can you tell my readers what that tool is about and what problems it solves?

Pasha: Well, like I mentioned above, we've used all the tools. And as SEOs who routinely got to battle in competitive, high-dollar niches, we were tired of competing using what was publicly available for keyword research and SEO competition analysis. We jokingly talked about building a better mousetrap for a couple of years, but time never allowed. Then we teamed up on a couple of projects where we needed to crunch a ridiculous amount of SERP data. Darrin just happens to be an amazing coder, and when we worked out the timeline for the project, it made more sense for him to take the outline of our dream SEO tool and build the framework for our own internal use. A couple of rough patches later, we had the keys to a super powerful SEO competition analysis tool capable of compiling at least 4 hours' worth of manual data collection in under a minute. Extrapolate that times 1,000 keywords and you can see how much time it saved us over the long haul. It was then a logical leap, especially when all our SEO friends were hassling us for access to it, to put a pretty face on it and make it publicly available.

Time is the #1 problem serpIQ solves, allowing you to scale your SEO research and analysis beyond anything you've seen before. Honestly, I can't even begin to explain how much better the workflow inside serpIQ is compared to anything else. That's why we decided to offer a free, full-featured trial, so people could see for themselves and we wouldn't have to write a bunch of copy convincing them to try it.

Then, it lays it all out for you in a super easy to digest format that’s both stimulating to the left and right brain. No excel files here. So you’ve got all the pertinent SERP data in front of you, laid out in a way that an experienced SEO can easily spot the holes in any page 1 SERP and focus their gameplan on matching and exceeding the weak link to frontpage their site as quickly as possible.

Next, we added the Keyword Discovery module, as we got tired of bouncing in and out of serpIQ to do keyword research. Now you can search keyword volumes, see any exact match (.com, .net, .org) domains available and one click send them to the Competition Analyzer. This part is actually really dangerous if you’re not diligent with your time. It’d be easy to spend an afternoon in the Keyword Discovery and find more profitable projects than you can complete in a year. Trust me, I’ve done it too many times already.

Finally, we wanted to help our subscribers that do client work make more money faster. We just launched our agency plan, which allows for white labeled PDF report generation. It only takes about 45 seconds to generate the report, and you can upload a logo for the front page and the footer of each page has the contact info that the subscriber uploads (Company name, address, phone #, website, etc..). This basically allows our subscribers to produce a report that looks like it took days of research and construction in under a few minutes. I’m not saying they should sell it to their clients like they spent two days on it, but they could.

Darrin: Pasha really hit on the major main points for why we built serpIQ, but I’ll just add one more thing. When you’re learning how to be a good SEO, it’s really important to manually find each data point that matters for a ranking site so that you can see where everything is coming from and can start to understand how the puzzle pieces all fit together. However, once you’ve progressed to the point where you understand all of the important data points, the value you derive from manually collecting all of that data DRASTICALLY decreases, to the point where you’re literally wasting your time copying and pasting data into a spreadsheet to generate a report.

That is one of the major reasons we built serpIQ: it gives you back your time and helps you make money faster. There's no value in manually compiling data once you understand it; you're actually costing yourself money by doing it by hand. We want to fix that for a few bucks a day.

What do you feel are the best ways that people can use your tools to get the most out of them?

Pasha: There are two distinct workflows for serpIQ. If you know the keyword you want to analyze the front page for, just type it into the Competition Analyzer, add your URL to compare against the top 10 (optional), and give it 20-60 seconds. This is good for client work, or if you're focusing on a niche you're already familiar with.

If you're more in the affiliate space, or somebody new to the whole "make money online" game, you would start in the Keyword Discovery by entering either broad (how I start, personally) or specific keywords, allowing the tool to pull in relevant info. At this point you'd see all the keywords with exact match domains available, for the low-hanging fruit, and then you can quickly click on keywords that look promising to see how difficult the front page will be to achieve. I made a video showing how I found a potentially profitable niche with easy-to-rank buying keywords in less than 5 minutes. I'll apologize in advance for the video quality; I'm no guru, more of an in-the-trenches type of guy. You can see it on YouTube.

Click here to view the embedded video.

For the SEO agencies and SEOs doing client work out there, I’ve attached a PDF report for this site, based on the keyword in your meta title, SEO Blog (I’m assuming you don’t need to optimize for Michael Gray or Graywolf). <PDF Attachment>

I don’t do a lot of client work anymore but if this was available 2 years ago, I’d have done backflips to have access to something like this. Again, it goes back to providing value. I know one of our beta testers landed a $20,000 SEO contract utilizing screenshots from serpIQ (pre-pdf report days) and I’ve heard numerous other stories about 4 and 5 figure contracts gained from the first impression they made with our tool and reporting.

Darrin: When brainstorming features, since we both come from the Internet Marketing side of things, I wanted to make sure serpIQ was useful for both Affiliates and Internet Marketers as well as SEO Consultants and Agencies. We think the Keyword Discoveries tool is great for people looking for new niches to attack, but it can also be used by agencies and consultants to help laser focus a client’s campaign on the more attainable, profitable keywords.

And if you’re a consultant or agency, the PDF reports will simply blow away your clients. Each report rings in at about 12-15 pages long, and takes about a minute from initial keyword entry to clicking the PDF button to get one, all with your own logo on them (if you have the Agency account, otherwise it will have the serpIQ logo on it). Literally no other tool will give you so much detail, so fast, so professionally done, at our price points. One new client landed with a serpIQ report should pay for at least a year of our service.

Whenever a tool gets released to the public, it gets used in unexpected ways, or people ask for new features you didn’t think of, did that happen for you?

Darrin: I definitely underestimated the amount of international users we would have. I’m pretty sure we now have users in every major timezone in the world (which makes for fun times when scheduling webinars) and because of that, the demand for international Google support and competition analysis is high. Luckily we’ve been able to account for that very well and our international users have been quite happy with it.

As a developer, you can only test your own software so well before you reach your own limits, so when we clicked “go” and set things live, it was interesting to see the edge cases of users breaking things or trying things I would have never thought of trying. Our users are really responsive though and have been super helpful in helping us sort through any bugs and we’ve hardened the software quite a bit since launch. We’re now approaching about 45,000 keywords analyzed which is a crazy amount of data we’ve pulled in considering we’re analyzing the top ten results for each keyword, so we’re well over 400,000 sites analyzed now. It’s been a lot of fun.

When we were talking offline, you mentioned you had some interesting data about the length of content and its correlation to rankings; let's talk about that a bit.

Darrin: One of the benefits of running a web based service like serpIQ is that we’re able to anonymously aggregate user data to put together some pretty amazing histograms and statistical analyses of all of our data that has consistently flipped a lot of SEO maxims on their heads. As Pasha will go into more detail in a second, one of the most eye opening discoveries we found was with Content Length on #1 ranking sites. We’ve found that 75% of sites have at least 600-800 words on the front page (which means only 25% of #1 results are there with less than 600 words of content).

Total Words and Rankings Graph (click to enlarge)

Pasha: Thank you for using correlation and not causation. As you can see here, http://blog.serpiq.com/pages/serpiq-snapshots, we've pulled some average data for different metrics out of serpIQ, and content length on the front page is substantially longer than the conventional wisdom SEOs have been spouting for years, i.e. that 500 words is sufficient. One thing about averages is that outliers really affect them, and the 60,000-word Wikipedia page ranking on the front page is more of an outlier than the 200-word thin product site ranking there. All things considered, though, the median content length of a front page result in our dataset is ~1,500 words. Needless to say, after seeing this data for the first time, I went back and buffed out a bunch of sites with an extra 1,000 words or so of unique, quality content, and yes, the results were favorable. Obviously, there's a lot more to the picture than content length, but it's interesting to see SEO urban myths debunked right before our eyes.

Let’s switch gears and look into the future, what do you think are some things that are on the horizon that will have a big effect on the internet marketing industry and people should be paying attention to?

Pasha: In a word, Facebook. In two words, Facebook and Bing. Obviously Facebook has the userbase of all userbases; the only question is how they'll respond to attempts to repurpose their users' time spent there. I know a lot of people who completely gave up advertising on Google to advertise on Facebook. And while Google has dominated the search landscape for a long time, change is on the wind. Google's algorithm, quite frankly, doesn't handle links very well. Even post-Panda, it's all too easy to linkspam your way above higher-quality, more deserving sites. The average internet searcher, my father-in-law for example, is tired of wading through the well-SEO'd junk floating around the top of the Google SERPs; in fact I'll quote him: "I use Bing because I can start on page 1, instead of page 3 or 4, to find what I want that isn't crap." Personally, I disagree, but I use Google for a lot more than just looking to buy stuff. I hope they get it straightened out, but if they don't, I look for Binghoo and Facebook to take enormous chunks of their virtual real estate in the next 12-24 months.

Darrin: I think mobile usage is going to continue to skyrocket, which will start to shift a lot of things that have traditionally worked well online. Facebook is already seeing that more than 25% of their users are mobile, and I'm sure Google has comparable data. I also agree with Pasha that Facebook is going to start branching out more into the standard internet and away from their walled garden. They're the ultimate retargeting machine, and with Like buttons on basically every site now, they have a HUGE amount of user data that can make for a very viable competitor to Adwords and Adsense. It could be really disruptive.

Thanks for taking the time to answer my questions today, since you’ve been kind enough to spend your time answering them, why don’t you take this opportunity to tell people about your company, and the services you offer.

Darrin: If you work at an agency or as a consultant, you are selling your time. SerpIQ will cut down a process that would traditionally take you all day to something that will take less than 5 minutes, letting you do more in less time and make more money with less work.

Pasha: Elevator pitch time, huh? SerpIQ is the best competition analyzer you’ll ever use, but don’t take my word for it: go try it for free (100% free, no credit card, just an email) right now and decide for yourself at SerpIQ.com. If you decide you like it, we’ve got plans with varying feature sets and volumes, ranging from $29/month to $149/month.

Thanks for taking the time to talk with us today, Michael.

The preceding has been a sponsored post.

photo credit: Shutterstock/sellingpix



This post originally came from Michael Gray, who is an SEO Consultant.


Optimising Your Site for Tablets

Tablets have become ubiquitous, so much so that any website owner would be foolish not to optimise their site to make it friendlier for tablet users. To help you get started, here are some tips for redesigning your site and realigning your strategy to ensure you don’t miss out on the growing popularity of tablets:

  • Convert Flash content to HTML5 – While some other tablets may support Flash, the iPad, which is still the most popular tablet, does not. If you want your site’s content to render correctly on an iPad, then you’d better start converting all that flashy Flash content to HTML5. Need more reason than the significant number of iPad users you’re missing out on? Then do it for your SEO efforts: after all, Flash is just about as search-bot friendly as it is iPad friendly, which is to say not friendly at all.
  • Use larger fonts and buttons – Tablets may have smaller displays than computer screens, but the funny thing is that larger fonts work better (read: are more user friendly) on mobile devices. Please your mobile users by making things easier on their eyes. It is also super annoying to keep pressing the wrong button, so make buttons finger-friendly and make them as large as you reasonably can.
  • Offer a way for users to toggle between tablet view and normal view – Ideally this toggle should only appear when the browser detects a tablet, but you can also just embed a toggle button on your site.
  • Avoid paginated pages – While clicking on a next-page link or swiping a finger to get to the next page is not that hard, it means more waiting time for your user as the next page loads. Since scrolling down is pretty easy on tablets, you should opt for a single “view all” page as much as possible.
  • Make your content downloadable – You don’t have to make everything downloadable, but at least choose your most popular or useful pieces and offer downloadable PDF versions of them. A lot of people use their tablets as ebook readers and would appreciate being able to read your content even when they are not connected to the Internet.
  • Be speed conscious – While studies show that tablet users are increasingly using their devices at home, a significant amount of tablet browsing is still done on the go. This means you have to make sure your site loads quickly if you want tablet users to stick around.
  • Simplify your checkout – This is important (especially for your conversion rate) since typing on a tablet is still not as easy as typing on a keyboard (unless an external keyboard is attached, of course). Make life easier for your users by simplifying your checkout so you don’t discourage them from completing their purchase.
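The tablet-view toggle tip above assumes you have some way of guessing whether a visitor is on a tablet in the first place. One common (if imperfect) approach is user-agent sniffing. As a rough sketch only — the `isTabletUA` helper and its patterns are illustrative assumptions, not an exhaustive or future-proof list — something like this could choose the default view, with your toggle button letting the user override it:

```typescript
// Heuristic tablet detection from a User-Agent string.
// NOTE: UA sniffing is inherently approximate; these patterns are
// simplified assumptions for illustration, not a complete list.
function isTabletUA(ua: string): boolean {
  // iPads identify themselves directly (at least pre-iPadOS 13).
  if (/iPad/.test(ua)) {
    return true;
  }
  // Android tablets include "Android" but, by convention, omit the
  // "Mobile" token that Android phone browsers carry.
  if (/Android/.test(ua) && !/Mobile/.test(ua)) {
    return true;
  }
  return false;
}
```

In the browser you would call it with `navigator.userAgent` and use the result to decide which stylesheet or template to serve by default; the explicit toggle should always win over the guess, since no detection heuristic is reliable for every device.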


Resources:

Start Thinking Mobile SEO by Steve Wiideman
SEO for the iPad by Phil Nottingham
Is It Time to Start Optimizing Your Sites for Tablets? by SEOPub

