If you’ve never heard of it, the Google Font directory is an online repository of free, open-source fonts that have been optimized for the web and are ready for your use. There are hundreds of them, from Display to Handwriting, Serif to Sans Serif. The Sans Serif category is home to Open Sans by Steve Matteson.
http://www.google.com/fonts#Analytics:total
With the release of analytics data on each font’s use, Steve has staked his claim as king of the free fonts, holding a commanding lead in both the first and second spots for most-used fonts… he holds the number four position as well. If you select the one-year time frame, you can see Open Sans has had 61 billion views over the past year. Now those are some big numbers.
We’ve used the Google Webfont directory for quite a few projects now. The selection is massive, the reliability is high (with Google doing all the hosting), and most importantly, it saves our clients time and money. Have you used Open Sans or any other font in this directory? Which is your favorite?
For years now Matt Cutts has been dishing out helpful advice and insights from the mind of Google to webmasters through the Google Webmaster Help YouTube channel. Typically these short Q&A-style videos provide a quick answer in less than 5 minutes so you can pick up a new tip and continue on your merry webmastering way. I frequently search back through old videos to fact-check and confirm dates when certain things were said pertaining to Google and optimizing websites for maximum potential. It doesn’t take too much effort to track down what I’m looking for, but as always, I love a faster way to do things.
Today Click Consult launched a website that brings you even closer to instant gratification when it comes to digesting the information found in this channel. Ladies and gents… I present to you “www.theshortcutts.com”
The purpose of this website is pretty self-explanatory: archive the information presented in these Google Webmaster Help videos as short, quick text answers. Now you can save yourself the 1-2 minutes of an already abbreviated Q&A session with Matt Cutts on all sorts of topics concerning webmastering. I’m kicking myself for not thinking of this first. It’s absolutely brilliant, from the pun in the domain down to the ability to filter videos by the color of shirt Matt is wearing! So go spend 5 minutes on the site and walk away with an accelerated education.
Today Google released a new Glass video simulating (yes, sorry to disappoint, but it’s still a simulation video no matter how accurate the UI might be) what it is like to wear Google’s new augmented reality device, Glass.
One of the brilliant moves Google made with this video release is the announcement that they are holding a contest to open up 8,000 more developer units (previously available only to those who opted in to buy one for $1,500 at Google I/O 2012). All you have to do is enter for free…
Using Google+ or Twitter, tell us what you would do if you had Glass, starting with the hashtag #ifihadglass.
http://www.google.com/glass/start/how-to-get-one/
I’m seeing plenty of entries on Google Plus, where this is currently a trending topic. Twitter didn’t show the hashtag as trending, but the term “Google Glass” was on the list. The smart part of this whole thing is that the contest gives just enough of a boost to carry buzz toward Google I/O 2013, and the public is generating that buzz for free. Sure, Google wants to get a few more units into the hands of people who would develop something cool, but most people aren’t even reading the fine print: you still have to shell out the cash if your entry gets chosen. Google is getting free publicity through this contest, and that to me is pretty genius.
While you still have to purchase a unit if you win the #ifihadglass competition, it’s still a great opportunity to get in on the ground floor of a device that could change the way we interact with the world. I plan to enter myself, but I doubt I’ll be chosen given the number of hopefuls posting entries every minute.
I generally have a hard time explaining what I do to people outside my business circles. Many family and friends know I do “computer stuff.” To finally make it click, I use simple terms: “I make your website show up in Google more often” or “I can tell you how many people visited your website on an AT&T iPhone from the state of Iowa in July of last year.” Once they understand that, the requests start pouring in. And that’s perfectly fine, because I’m usually the first to help out a friend in need. So I figured I’d share with you, the reader, a few things I check first when looking at a new web analytics account. At Perficient, when we do a full study for a client, we dig in much deeper, but at a high level these are a few things you can look at on your own.
There are many other things you can do to get a good idea of how healthy a website is, and the deeper you dive, the more interesting things you can find. This just scratches the surface of what professionals can learn from your analytics. At Perficient we take a deeper look into your analytics, package our findings into an easy-to-read report, and then recommend the next steps you need to take in order to accomplish your goals.
Facebook’s new Graph Search is a very interesting tool that can be used to dive into the Facebook data that is publicly available. It can also make it ridiculously easy to dig up embarrassing things you’ve forgotten you even said about yourself. So how will you use this new tool? Will you use it to connect with new people who share your interests, as Facebook intends, or will you use it to embarrass a distant acquaintance with their ignorance of how public their Facebook posts really are?
Interested in networking? There are of course many avenues for doing so online (LinkedIn being one of the best known), but with the release of Facebook Graph Search, finding people outside your existing circle of friends is easy. Try a simple search like “People who I work with at Perficient and live nearby,” or add a further filter with “and like Paddy’s Pub” so you can plan a popular meet and greet.
The internet is famous for bringing out the worst in people, and access to this level of data will show that (both for the people who think up these searches and for those who can be found by them). Just days after the launch of Facebook Graph Search, we’re seeing screenshots of horrifying cross-referenced filters that produce results such as “Current employers of people who like racism” or “Single women who live nearby and who are interested in men and like Getting Drunk.” I’m sure you can think of a few creative searches of your own as well.
Image courtesy of http://actualfacebookgraphsearches.tumblr.com/
The internet only knows what you share with it. Never consider a post sacred or secret. You never really know who’s on the other end of the message, and most of these mishaps are the result of people who haven’t educated themselves; those who end up in these searches are often the victims of their own ignorance. The good news is that Facebook Graph Search obeys the privacy settings tied to your account. Review them and make sure only the information you want public is available, before you find yourself in a search result like those above.
We marketers need to move at the speed of the internet, and that pace is getting faster every day. Unfortunately, the larger and more robust your website is, the more difficult it can be to work with the IT team to push out the code updates and changes your marketing needs. Code freezes while new site features are being deployed can ruin time-sensitive campaigns. Some marketers have complained it can take up to nine months before IT is ready to act on their requests. What do you think that kind of delay does to your productivity as a marketer?
It’s because of these hardships that a solution called Tag Management has started to creep into the marketer’s bag of tools. Tag Management gives you the ability to control the code on your website without relying on your website administrators to make those changes. It works by placing a single piece of code on the website that acts as a container into which you can insert many different tracking tags. Once you have the container tag placed on the website, you manage the tags that go into that container from a tag management interface elsewhere. There you can load up several tags and write custom logic rules so the Tag Management solution knows whether or not to fire the tracking tag code.
Late last year Google released its own Tag Management tool, aptly named Google Tag Manager (GTM). It’s completely free, and if you think your business might benefit, you may want to check this solution out. It’s not quite as mature as other Tag Management solutions out there, but Google does have a list of features it promises are coming soon.
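To make that concrete, here’s a minimal sketch (my own illustration, not taken from GTM’s documentation) of how a page signals an event to GTM through its standard dataLayer so a firing rule can decide whether to run a tag. The event name and variable below are hypothetical placeholders:

// Push an event into GTM's dataLayer. The container snippet placed once on
// the page defines window.dataLayer, and a rule built in the GTM interface
// can listen for this event name to fire a tracking tag.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  event: 'newsletterSignup',   // hypothetical event name a GTM rule matches on
  signupSource: 'footer-form'  // hypothetical variable a tag can read
});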
Today’s problem/solution asks the question: how do you use Google Analytics’ site search reports when there is no query string pattern in the URL?
First off, you need administrator-level access to reach these settings. The complete path looks like this:
Admin > {select account} > {select property} > {select profile} > Profile Settings > Site Search Settings
You’ll need to click the Admin link on the top right of the orange bar and then select the account from the accounts list. Once your account is selected, choose the property and then the profile you want to apply this to. Then simply select Profile Settings, and you’ll see the following under Site Search Settings.
From here the basic settings are quite simple. All you need to do is enter the query parameter your website puts in the URL to designate the search term. The screenshot above is from a WordPress blog, so I used “s”. Now Google Analytics will look in every URL for “?s=search+terms”-style strings, and you will see “search terms” end up in the site search report.
Now that we know how it “should” work, how do we set things up when there are no search terms in the URL? What if performing a search on your website doesn’t affect the URL at all? The answer seems almost too simple to work: we create our own query string for Google Analytics. This is another solution we came up with for a client whose POS system was built entirely with Ajax and never naturally refreshes the URL. We again use the ability to send virtual pages to Google Analytics, only this time we append a search query parameter, along with the search term, to the URL. The code looks like this when you’re using analytics.js:
ga('send', 'pageview', { page: '/searchresults?query=shoes' });
All you have to do is fire that off with the appropriate search term whenever your customer uses site search, and you’ve formatted your URL into something Google Analytics can process through the site search settings. Google Analytics receives the full URL, including the parameter and its value. We then go back into the site search settings and simply enter “query” as the query parameter so Google knows to strip out the search term that follows and place that data into the site search reporting.
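To show the wiring end to end, here’s a minimal sketch; the handler name onSiteSearch is hypothetical, and you’d call it from whatever callback your Ajax search already uses:

// Hypothetical handler: call this from your Ajax search callback.
// Assumes analytics.js is already loaded on the page as ga().
function onSiteSearch(term) {
  ga('send', 'pageview', {
    // encodeURIComponent keeps multi-word search terms URL-safe
    page: '/searchresults?query=' + encodeURIComponent(term)
  });
}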
I’d like to introduce a new post series that I think might be helpful. In this ongoing Problem/Solution series I plan on posing a question to which I did not know the answer. After performing a bit of research, I come to a solution, which I then share with you. I’m a big fan of shortcuts when it comes to problem solving, so hopefully this post saves you time and effort as you pass off these ideas as your own… I mean, I would.
Today’s problem is how to use web analytics to track non-websites. The current project I’m working on is an application where 95% of the functionality is performed with Ajax. In total there are three unique URLs in the entire application, even though, based on functionality, there could be 20+ “pages” of core information. So the problem is: “How do we track an application powered by Ajax?” The solution is to force the application to behave like a website when it sends data to web analytics. For this we’ll use custom events and virtual pages. I’ll use Google Analytics in my code examples because it’s free and you too can play along.
Typically, event tracking is a method for recording interactions with website elements. You can put it on an affiliate link to record when the link is clicked, then match that up later with your conversions to see how persuasive your copy is. You can use it to see how many times someone clicked an image of a button that says “push me” but does absolutely nothing. You can even use it to see how users navigate your “all Flash” navigation (shudders). We’ll be using event tracking for cases closest to the latter. Typical websites have links that, when clicked, travel to a new page and change the uniform resource locator… yeah, I know what URL stands for. Wanna fight about it? With Ajax elements, you can send and retrieve data from a server in the background without ever changing the displayed web page. Since you are not loading a new URL, you typically aren’t firing the analytics tracking code and sending more data to Google Analytics. This means you aren’t being tracked. Counteract that with code like the following on the elements you want to show up in your analytics (in our case, every single link the all-Ajax application offers the user).
Keep in mind this is the new syntax if you are using the new analytics.js for tracking and not ga.js.
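A minimal sketch of that event syntax, assuming analytics.js is already loaded as ga(); the element ID and the category/action/label values are hypothetical placeholders:

// Fire an event hit to Google Analytics when an Ajax-driven element is clicked.
// analytics.js event syntax: ga('send', 'event', category, action, label);
document.getElementById('nav-settings').addEventListener('click', function () {
  ga('send', 'event', 'Navigation', 'click', 'Settings');
});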
Virtual pages refer to sending pageviews to Google Analytics where none really exist. It’s a practice commonly used for organizing content such as downloads into a directory structure. You place this code on a download link and create a pageview for, say, “/virtual/whitepapers/analytics-lesson.pdf”; even though the PDF is what’s downloaded, you see what looks like a pageview on your site to a directory that doesn’t exist. Then you can go into your goal settings and set a goal that watches for any “/virtual/whitepapers/” URL to produce a report on how often your white papers are being downloaded. The code looks like this:
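A minimal sketch using the white paper path from the example above; the link ID and click-handler wiring are my own assumptions, so attach it however your site handles download links:

// Send a virtual pageview when the white paper link is clicked.
// analytics.js accepts the virtual page path directly as the third argument.
document.getElementById('whitepaper-link').addEventListener('click', function () {
  ga('send', 'pageview', '/virtual/whitepapers/analytics-lesson.pdf');
});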
Since we’re dealing with an Ajax site that has only 2-3 unique URLs when the content could logically be divided into 20+ URLs, this technique lets you see the content in a much more granular fashion. Simply add this code to any link that replaces the overall page content with new information. It’s the same technique you use for tracking mobile applications: if you have a game on your iPhone with a menu system that shows Settings, High Score, Credits, etc., you simply divide the application into sections that make sense and push them as pageviews following a logical URL structure.
So that’s how we solve the problem of tracking a site that is powered by Ajax. Just dropping the analytics.js tag onto your site or application isn’t quite enough; you need to get your hands dirty and do some advanced configuration to ensure the data is logged correctly. Hopefully you find this format helpful, as I have a few more problem/solution guides coming. Let me know if there are any other types of problems you have yet to solve, and maybe we can put our heads together to come up with a solution!
Last October Google announced the next evolution of what is now known as Google Analytics: a product, currently in beta, called Universal Analytics. The name is quite fitting. With UA and its new Measurement Protocol and collection APIs, you can measure just about anything and tie it all together (universally, even) with your existing web analytics.
Imagine having cost analysis data not only for AdWords but for the other networks you work with as well; now you can tie cost per conversion to your banner ad campaign to measure its success. Imagine comparing conversions from your company’s mobile application against the same data your website produces, all in the same report. Imagine tracking badge swipes at your building’s security doors to understand which areas of your office get the most traffic and where it makes the most sense to post company memos for maximum visibility. That last example seems kind of crazy, but in the new future of Universal Analytics, it is entirely possible. It’s a true marriage of offline data into the online world, giving you the power of Google Analytics’ reporting tools to make more informed decisions about how you run your business.
There was always a way to mash this kind of information together in the past, but never before has Google provided the tools on its end to make it happen. The tool is currently in beta, but another perk of my job as a consultant is the pleasure of working with big enterprise companies that get whitelisted into these kinds of betas. Right now I’m on a project that uses Universal Analytics to capture every interaction that happens in a point-of-sale system, gather the data from every POS terminal in every store onto a central computer, and then deliver it each night to Google via the new Measurement Protocol. I’m learning lots of new tricks on this project, so stay tuned for a new problem-solving series of posts covering such topics…
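To give you a flavor of what a Measurement Protocol hit looks like, here’s a minimal sketch of sending a single hit from a server with Node.js; the tracking ID, client ID, and event values are placeholders, and a real POS pipeline would queue and batch these rather than send them one at a time:

// One Measurement Protocol (v1) hit, POSTed to the /collect endpoint.
// v, tid, cid, and t are the required parameters.
var https = require('https');
var querystring = require('querystring');

var hit = querystring.stringify({
  v: 1,                                        // protocol version
  tid: 'UA-XXXXX-Y',                           // placeholder tracking ID
  cid: '35009a79-1a05-49d7-b876-2b884d0f825b', // anonymous client ID
  t: 'event',                                  // hit type
  ec: 'pos-terminal',                          // hypothetical event category
  ea: 'sale',                                  // hypothetical event action
  ev: 1999                                     // hypothetical event value
});

var req = https.request({
  hostname: 'www.google-analytics.com',
  path: '/collect',
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
});
req.on('error', function (err) { console.error(err); });
req.write(hit);
req.end();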
The good news is that Universal Analytics will eventually be available to all users. So for now take some time to read up on the new Measurement Protocol and start thinking of how you’ll use the new toys when they become available to everyone.
https://developers.google.com/analytics/devguides/collection/protocol/v1/
I encourage everyone to take a quick three minutes to watch this quite humorous mashup video by Sam Applegate of http://www.ninja-creative.com. Matt Cutts is the head of webspam at Google; he’s basically Google personified when it comes to why a website ranks (or fails to rank) in search results pages. In his own words, from his Google Plus profile:
“I’m the head of the webspam team at Google. That means that if you type your name into Google and get porn back, it’s my fault. Unless you’re a porn star, in which case porn is a completely reasonable response.”
I’ve met Matt a few times myself, and as you can see from his quote, he has a good sense of humor. That’s why the concept of taking a bunch of his Q&A videos from Google’s Webmaster Help YouTube channel and mashing them up to give horrible advice is so hilarious.
Still, I’d like to take some time to address the topics covered in this video and explain exactly why you would NOT want to do these things to gain favor from Google.
Ten years ago (in the stone ages of search engines), web pages were simply indexed and categorized based on the content found on the page. Not much else influenced ranking at that time, and by repeating a keyword over and over on the page, you magically became extremely relevant and an authority on the subject according to the simple (and woefully inadequate) ranking algorithms of the day.
Thankfully for us, search engines have advanced to the point where on-page content is just one part of the equation. Your page’s ranking is determined by more than 200 unique signals, of which on-page keywords are but one.
How one website connects to another is what set Google apart from other search engines when it launched. Google looked at more than just the keywords on a page for ranking: it developed a method called PageRank (named after Google cofounder Larry Page) that assigns a value to your domain depending on how many pages link to it and how “trusted” each linking website is. Links from .GOV or .EDU sites usually carry more weight. If your website happens to be mentioned on CNN.com, you can bet you’ll see a boost in PageRank. On the other hand, if all your incoming links come from the three P’s (pills, porn, and poker), you will generally see a lower PageRank score.
Google never accepts payment to add a site to the index, update it more often, or improve its ranking. Some people still believe favor is shown to websites that run AdSense, the thinking being that Google would stand to profit by sending visitors more often to sites where clicking an AdSense link sends money back to Google. This is simply not the case. Google is a money-making machine: it still controls 67% of the entire US search engine market and has country-specific search engines in just about every country out there. It has no need to game the system.
If you’d like to learn more about how Google indexes and searches the web, check out their microsite on the subject. They’ve done such a good job of describing “how it all works” that anyone can understand the process.
http://www.google.com/competition/howgooglesearchworks.html
New year, new resolutions. Even if you are the type, like me, who hates the whole concept of January first being the de facto day for resolutions, you still can’t escape them. They are everywhere this time of year, and the world is full of helpful tips and strategies to help you honor those poorly-thought-out promises you never should have made in the first place.
Even though I loathe the practice of using a new year to state a resolution, it’s still as good a time as any to start planning for the future. Here is my most important resolution this year…
Learn
It’s as simple as that. This year I’m going to focus on expanding my knowledge, and in turn I’ll hopefully have plenty of exciting things to share with you. I’m going to read. I’m going to apply. Most importantly, I’m going to share things I didn’t know before, hoping you didn’t know them either. To start, I’ll share something I learned at the beginning of 2012: resolutions are hard to keep! Take a look at the following chart.
This is a snapshot of January traffic to a website I monitor. Notice the kind of traffic it gets on January first? It’s almost double what it saw weeks earlier in December, and then it quickly trails off.
What kind of website, you ask? …A health club.