ProBlogger: Filtering Out Google Analytics Junk to Read Your Numbers Better

Filtering Out Google Analytics Junk to Read Your Numbers Better

Posted: 15 Jan 2015 06:06 AM PST

This is a guest contribution from Larry Alton.

Web developers, content managers, marketing teams, and many other online professionals rely on Google Analytics to understand visitor trends. However, you can run into a significant amount of noise, which can skew your Google Analytics numbers and your subsequent interpretations of this data.

Luckily, you can filter out certain types of traffic so that your numbers aren’t watered down by your own visits or by Web crawlers, or duplicated because of letter-case discrepancies in web addresses. Here are three main filters to consider setting up as you move forward with a Google Analytics strategy.

Cutting Out Internal Traffic

Every time you and your colleagues navigate your own website, you skew your traffic numbers. Luckily, you can filter these visits out of your Google Analytics reports so that you get a more accurate picture of your real traffic.

Just head over to your Admin page and select “Filters” under the “View” column. Next, click on “+New Filter” and make sure that the “Create New Filter” bubble is selected.

Name your filter something like “Exclude office traffic” or “Exclude home traffic.” Choose the “Custom Filter” option, then select “IP address” from the dropdown menus.

When you enter the IP address in the Filter Pattern field, you’ll need to put a backslash before each dot, according to Google’s regular expressions requirements.
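Why the backslashes matter: in a regular expression, a bare dot matches any character, so an unescaped IP could accidentally match other strings. A minimal sketch in Python (the IP shown is a documentation-range example, not a real address):

```python
import re

# Hypothetical office IP address (203.0.113.5 is from the reserved
# documentation range, so it is illustrative only).
ip = "203.0.113.5"

# re.escape() puts a backslash before each dot, producing the same
# kind of pattern Google asks you to type into the filter field.
pattern = re.escape(ip)
print(pattern)  # 203\.0\.113\.5

# Unescaped, the dots act as wildcards and match unrelated strings:
print(bool(re.fullmatch(ip, "203a0b113c5")))       # True  (wrong!)
print(bool(re.fullmatch(pattern, "203a0b113c5")))  # False
print(bool(re.fullmatch(pattern, ip)))             # True
```

The same escaped string is what you paste into the Filter Pattern field.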

Excluding Bots and Spiders

It can be extremely frustrating to examine your web traffic data, only to find that recurring bots and spiders are responsible for a large chunk of the pie. Luckily, Google is taking proactive measures to protect Analytics users from these annoyances.

You can opt into Google’s automated bot and spider filtering by going to your Admin panel, clicking on “Reporting View Settings” and checking the box that reads, “Exclude all hits from known bots and spiders.” However, some bots and spiders will still slip through. You can target these individual irritants by creating a new filter, selecting “Custom” and then choosing “Visitor ISP Organization.” Then enter the service provider of the bot using a regular expression.
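To see how an ISP-organization pattern behaves, here is a minimal sketch with made-up provider names (“examplebot hosting” and “crawler-cloud inc” are hypothetical; real values depend on what Analytics records for each visit):

```python
import re

# Simulated report rows: ISP organization per visit (hypothetical names).
visits = [
    {"isp_org": "examplebot hosting", "pageviews": 40},
    {"isp_org": "comcast cable", "pageviews": 12},
    {"isp_org": "crawler-cloud inc", "pageviews": 25},
]

# One pattern can exclude several bot providers; "|" means OR,
# just as it does in a Google Analytics filter pattern.
bot_pattern = re.compile(r"examplebot|crawler-cloud")

human_visits = [v for v in visits if not bot_pattern.search(v["isp_org"])]
print(human_visits)  # only the "comcast cable" row remains
```

As you spot new bots in your reports, extending the alternation (`examplebot|crawler-cloud|newbot`) is usually simpler than stacking many separate filters.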

Keep an eye on your analytics, and be sure to create manual filters for additional bots that attempt to sneak past you. This can prevent bothersome bots and spiders from skewing your website’s data.

Enforcing Lowercase

If visitors type a URL into their browser, or click links, that use a mix of uppercase and lowercase characters, you could wind up with duplicate Google Analytics entries for the same page. Luckily, you can fix this issue by creating a filter.

Just create a brand new filter and call it something like “Force Lowercase.” Choose “Custom,” click on the “Lowercase” bubble, and select “Request URI.” Once this is done, you should stop seeing multiple entries when browsers load up a page using different letter cases.
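The effect of the Lowercase filter is easy to demonstrate. A small sketch with hypothetical Request URI values (the paths are made up for illustration):

```python
from collections import Counter

# Simulated Request URI values: three refer to the same page but
# differ only in letter case (hypothetical paths).
uris = ["/About-Us", "/about-us", "/ABOUT-US", "/contact"]

raw_counts = Counter(uris)
print(raw_counts)        # four distinct entries for two real pages

# What the Lowercase filter does: normalize case before counting.
normalized_counts = Counter(u.lower() for u in uris)
print(normalized_counts) # {'/about-us': 3, '/contact': 1}
```

After normalization, the three case variants collapse into a single `/about-us` entry, which is exactly the deduplication the filter performs in your reports.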

Increase the accuracy of your Google Analytics traffic data by using filters to cut through the noise. Don’t allow your metrics to become skewed by your own internal traffic, spiders and bots, or by web addresses that contain a mixture of letter cases.

Larry Alton is an independent business consultant specializing in social media trends, business, and entrepreneurship. Follow him on Twitter and LinkedIn.

 

Originally at: Blog Tips at ProBlogger

10 Tips About Local SEO - DailyBlogTips

10 Tips About Local SEO

Posted: 15 Jan 2015 06:50 AM PST

Local SEO can be very profitable, yet most bloggers and webmasters ignore it. If you are not familiar with the term, it refers to optimizing your website or application to search queries that have a strong local component. For instance, if someone searches for “good dentist”, Google will assume that the person is looking for a good dentist close to his or her location, and will therefore adapt the search results to that constraint.

With the explosion of mobile phones, more and more search queries are getting influenced by local SEO, and that is why you should learn about it. A couple of weeks ago I came across an interesting article on Search Engine Land titled 10 Things I Learned About Local SEO In 2014. Here’s a quotation:

8. The World Really Is Going Mobile
We all talk about it, but this past Black Friday/Cyber Monday, we saw a 100+% increase in mobile traffic to our local retailer clients. According to one national/local client that has a partnership with Google, Google has told them to expect that 80% of their organic traffic in 2015 will come from mobile search.

The challenge is that, as more searches go mobile, conversion can decline. This is particularly true if your site is not mobile optimized and definitely true if you are selling stuff versus just trying to generate phone calls.

If you have not started your mobile-friendly strategy yet, what are you waiting for?

I liked the article because the tips are not the usual stuff you’ll find in other guides. Check it out if you have some time, and explore the local SEO concept further if your online business is affected by it.
