
Thursday, February 26, 2015

Finding more mobile-friendly search results

Webmaster level: all

When it comes to search on mobile devices, users should get the most relevant and timely results, whether that information lives on mobile-friendly web pages or in apps. As more people use mobile devices to access the internet, our algorithms have to adapt to these usage patterns. In the past, we’ve made updates to ensure a site is configured properly and viewable on modern devices. We’ve made it easier for users to find mobile-friendly web pages and we’ve introduced App Indexing to surface useful content from apps. Today, we’re announcing two important changes to help users discover more mobile-friendly content:

1. More mobile-friendly websites in search results

Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact on our search results. Consequently, users will find it easier to get relevant, high quality search results that are optimized for their devices.

To get help with making a mobile-friendly site, check out our guide to mobile-friendly sites. If you’re a webmaster, you can get ready for this change by using the following tools to see how Googlebot views your pages:

  • If you want to test a few pages, you can use the Mobile-Friendly Test.
  • If you have a site, you can use your Webmaster Tools account to get a full list of mobile usability issues across your site using the Mobile Usability Report.

2. More relevant app content in search results

Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.

If you have questions about either mobile-friendly websites or app indexing, we’re always happy to chat in our Webmaster Help Forum.


Wednesday, February 18, 2015

Case Studies: Fixing Hacked Sites

Webmaster Level: All

Every day, thousands of websites get hacked. Hacked sites can harm users by serving malicious software, collecting personal information, or redirecting them to sites they didn't intend to visit. Webmasters want to fix hacked sites quickly, but unfortunately recovering from a hack can be a complicated process.

We're trying to make the process of recovering from a hack easier for webmasters with features like Security Issues, Help for Hacked Sites, and a section of our forum just for hacked sites. Recently we talked to two webmasters with hacked sites to learn more about how they were able to fix their sites. We're sharing their stories with the hope that they might provide ideas to other webmasters who have been victims of hacking. We're also using these stories and other feedback to improve our documentation for hacked sites, making the process easier for everyone going forward.

Case Study #1: Restaurant website with multiple hack-injected scripts

A restaurant website using WordPress received a message from Google in their Webmaster Tools account, alerting them that their site had been altered by hackers. To protect Google users, the website was labeled as hacked in Google's search results. The webmaster of the site, Sam, looked at the source code and noticed many unfamiliar links on the site with pharmaceutical terms such as "viagra" and "cialis." She also noticed many pages where the meta description tags (in the HTML) had content added such as "buy valtrex in florida." There were also hidden div tags in the HTML of many pages that linked to other sites. None of these links were added by Sam.
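
To illustrate, hack-injected markup of this kind often looks something like the sketch below (the domains and snippets here are hypothetical, not the actual code from Sam's site):

    <!-- Injected into the page's <head>: a spammy meta description -->
    <meta name="description" content="buy valtrex in florida">

    <!-- Injected into the page's <body>: links hidden from normal visitors
         but still present in the HTML that search engine crawlers parse -->
    <div style="display:none">
      <a href="http://spam.example.net/viagra">cheap viagra</a>
      <a href="http://spam.example.net/cialis">cialis online</a>
    </div>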

Sam removed all of the hacked content she found and filed a reconsideration request. The request was rejected, but in the message she received from Google, she was advised to check for any unfamiliar scripts in her PHP files (or any other server files), as well as for changes to the .htaccess file. Files like these often contain scripts added by hackers to modify the site. These scripts typically show the hacked content only to search engines, while hiding it from a normal user. Sam checked all of the .php files and compared them to the clean copies she had in her backup. She found new content added to her footer.php, index.php, and functions.php. When she replaced those files with the clean backups, she could no longer find any hacked content on her site. When she filed another reconsideration request, she got a response from Google notifying her that her site was free from hacked content!

Even though Sam had cleaned up the hacked content on her site, she knew that she would need to continue to secure her site against future attacks. She followed the steps below to keep her site safe in the future:

  • Keep the CMS (content management system, such as WordPress, Joomla, or Drupal) up to date with the most current version. Make sure plugins are up to date as well.
  • Make sure the account used to access the administrative features of the CMS uses a difficult and unique password.
  • If the CMS supports it, enable 2-step verification for login. (This might also be called two-factor authentication or two-step authentication.) This is recommended for the account used for password recovery as well. Most email providers, such as Google, Microsoft, and Yahoo, support this!
  • Make sure the plugins and themes installed are from a reputable source: pirated plugins or themes can contain code that makes it even easier for hackers to get in!

Case Study #2: Professional website with lots of hard-to-find hacked pages

A small business owner named Maria, who also manages her own website, received a message in her Webmaster Tools account that her site had been hacked. The message provided an example of a page added by hackers: http://example.com/where-to-buy-cialis-over-the-counter/. She talked to her hosting provider, who looked at the source code on the homepage but could not find any pharmaceutical keywords. When the hosting provider visited http://example.com/where-to-buy-cialis-over-the-counter/, it returned an error page. Maria also bought a malware scanning service, but the service was not able to find any malicious content on her site.

Maria then went to Webmaster Tools and used the Fetch as Google tool on the example URL Google had provided (http://example.com/where-to-buy-cialis-over-the-counter/) which returned no content. Confused, she filed a reconsideration request and received a rejection message which advised her to do two things:

  1. Verify the non-www version of her site as hackers often try to hide content in folders that may be overlooked by the webmaster.

    While it may seem like http://example.com and http://www.example.com are the same site, Google actually treats these as different sites. http://example.com is referred to as the "root domain," while http://www.example.com is a subdomain. Maria had verified http://www.example.com but not http://example.com, which mattered because the pages added by hackers were non-www pages like http://example.com/where-to-buy-cialis-over-the-counter/. Once she verified http://example.com, she was able to see the hacked content on the provided URL with the Fetch as Google tool in Webmaster Tools.

  2. Check her .htaccess file for new rules.

    Maria talked to her hosting provider who showed her how to access her .htaccess file. She noticed right away that her .htaccess file had some strange content that she had not added:

    <IfModule mod_rewrite.c>
    RewriteEngine On
    # Match requests whose user agent looks like a search engine crawler...
    RewriteCond %{HTTP_USER_AGENT} (google|yahoo|msn|aol|bing) [OR]
    # ...or visitors arriving from a search engine results page
    RewriteCond %{HTTP_REFERER} (google|yahoo|msn|aol|bing)
    # Send those requests to the hacker's script, which serves the spam content
    RewriteRule ^([^/]*)/$ /main.php?p=$1 [L]
    </IfModule>

    The mod_rewrite rule above was inserted by the hacker. It redirects anyone coming from certain search engines, as well as search engine crawlers, to main.php, which generates all of the hacked content. Rules like these can also redirect users accessing the site on a mobile device. On the same day, she also saw that a recent malware scan had found suspicious content in the main.php file. On top of that, she noticed an unknown user in the FTP users area of her website development software.

She removed the main.php file and the .htaccess file, deleted the unknown user from her FTP users area, and her site was no longer hacked!

Steps to prevent getting hacked in the future

  • Avoid using FTP when transferring files to your servers. FTP does not encrypt any traffic, including passwords. Instead, use SFTP, which will encrypt everything, including your password, as a protection against eavesdroppers examining network traffic.
  • Check the permissions on sensitive files like .htaccess. Your hosting provider may be able to assist you if you need help. The .htaccess file can be used to improve and protect your site, but it can also be abused by hackers who gain access to it.
  • Be vigilant and look for new and unfamiliar users in your administrative panel and any other place where there may be users that can modify your site.

We hope your site never gets hacked, but if it does, we have many resources for hacked webmasters on our Help for Hacked Sites page. If you need more help or would like to share your own tips, you can post in our Webmaster Help Forum. If you do post to the forum or submit a reconsideration request for your site, please include #NoHacked.

Wednesday, December 3, 2014

Are you a robot? Introducing “No CAPTCHA reCAPTCHA”

reCAPTCHA protects the websites you love from spam and abuse. So, when you go online—say, for some last-minute holiday shopping—you won't be competing with robots and abusive scripts to access sites. For years, we’ve prompted users to confirm they aren’t robots by asking them to read distorted text and type it into a box.

But, we figured it would be easier to just directly ask our users whether or not they are robots—so, we did! We’ve begun rolling out a new API that radically simplifies the reCAPTCHA experience: the “No CAPTCHA reCAPTCHA,” a simple checkbox asking users to confirm, “I’m not a robot.”

On websites using this new API, a significant number of users will be able to securely and easily verify they’re human without actually having to solve a CAPTCHA. Instead, with just a single click, they’ll confirm they are not a robot.
A brief history of CAPTCHAs 

While the new reCAPTCHA API may sound simple, there is a high degree of sophistication behind that modest checkbox. CAPTCHAs have long relied on the inability of robots to solve distorted text. However, our research recently showed that today’s Artificial Intelligence technology can solve even the most difficult variants of distorted text with 99.8% accuracy. Distorted text, on its own, is therefore no longer a dependable test.

To counter this, last year we developed an Advanced Risk Analysis backend for reCAPTCHA that actively considers a user’s entire engagement with the CAPTCHA—before, during, and after—to determine whether that user is a human. This enables us to rely less on typing distorted text and, in turn, offer a better experience for users.  We talked about this in our Valentine’s Day post earlier this year.

The new API is the next step in this steady evolution. Now, humans can just check the box and in most cases, they’re through the challenge.

Are you sure you’re not a robot?

However, CAPTCHAs aren't going away just yet. In cases when the risk analysis engine can't confidently predict whether a user is a human or an abusive agent, it will prompt a CAPTCHA to elicit more cues, increasing the number of security checkpoints to confirm the user is valid.
Making reCAPTCHAs mobile-friendly

This new API also lets us experiment with new types of challenges that are easier for us humans to use, particularly on mobile devices. In the example below, you can see a CAPTCHA based on a classic Computer Vision problem of image labeling. In this version of the CAPTCHA challenge, you’re asked to select all of the images that correspond with the clue. It's much easier to tap photos of cats or turkeys than to tediously type a line of distorted text on your phone.
Adopting the new API on your site

As more websites adopt the new API, more people will see "No CAPTCHA reCAPTCHAs". Early adopters, like Snapchat, WordPress, Humble Bundle, and several others, are already seeing great results with this new API. For example, in the last week, more than 60% of WordPress’ traffic and more than 80% of Humble Bundle’s traffic on reCAPTCHA encountered the No CAPTCHA experience—users got to these sites faster. To adopt the new reCAPTCHA for your website, visit our site to learn more.
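
For reference, the basic client-side integration looks roughly like the sketch below; YOUR_SITE_KEY stands in for the site key issued when you register your site, and /submit is a hypothetical form handler:

    <html>
      <head>
        <!-- Load the reCAPTCHA API -->
        <script src="https://www.google.com/recaptcha/api.js" async defer></script>
      </head>
      <body>
        <form action="/submit" method="POST">
          <!-- Renders the "I'm not a robot" checkbox widget -->
          <div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div>
          <input type="submit" value="Submit">
        </form>
      </body>
    </html>

When the form is submitted, it includes a g-recaptcha-response token that your server should verify with the reCAPTCHA verification API before trusting the request.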

Humans, we'll continue our work to keep the Internet safe and easy to use. Abusive bots and scripts, it’ll only get worse—sorry we’re (still) not sorry.


Monday, October 6, 2014

Bring your local business online -- no website required!

Webmaster Level: Beginner

“Hey, how do I get my business on the web?” Having worked at Google for nine years, if I had a penny for every time someone asked me that question… :) To answer, today we’re releasing a short video series (30 minutes total!), sharing the same advice we’d give to our friends and family. It’s the advice I’d give to my sister, Marnie, who owns a jewelry store, or my cousin, Scott, who works as a realtor. Video spoiler alert: You won’t need to make a website, but you definitely need a way for your local business to reach potential customers using their mobile phones, tablets, or desktop computers.

A video series to help local business owners of all technical levels get their businesses found on the web. It focuses on the benefits of creating a Yelp business page, Facebook page, Google+ page, etc.

The great thing about video is that you can pause at any time and work at your own pace. Next time you hear the question: “How do I get my business on Google?”, please share the link and let's get more local businesses online!

Series: Build an online presence for your local business

Video #1: Introduction and hot topics (3:22)
Meet my sister, Marnie, who owns a jewelry store and my cousin, Scott, who works as a realtor. Follow them as we talk about the big changes in the last decade, such as making sure your business can reach customers at work, home, or on-the-go using their mobile phones.
Video #2: Determine your business’ value-add and online goal (4:08)
With the example of Scott, the realtor, you’ll learn about the marketing funnel, setting an online goal, and highlighting what makes your business special.
Video #3: Find potential customers (7:41)
Marnie and Scott figure out their customers’ most common journeys to reach their business. We'll use their examples to brainstorm how you can reach customers on review sites, through search engines, maps apps, and social and professional networking sites.
Video #4: Basic implementation and best practices (5:23)
The fundamentals and best practices to take your business from offline to online!
Video #5: Differentiate your business from the competition (5:09)
With Scott’s business as a realtor, see how to demonstrate that your local business is the best choice for customers by adding photos, videos, and getting reviews.
Video #6: Engage customers with a holistic online identity (4:51)
We'll end the series by showing how Scott makes sure his online presence sends a cohesive message to customers and answers all their common questions. :)

Monday, September 8, 2014

Webmaster Academy now available in 22 languages

Webmaster level: Beginner

Today, the new Webmaster Academy goes live in 22 languages! New or beginner webmasters speaking a multitude of languages can now learn the fundamentals of making a great site, providing an enjoyable user experience, and ranking well in search results. And if you think you’re already familiar with these topics, take the quizzes at the end of each module to prove it :).

So give Webmaster Academy a read in your preferred language and let us know in the comments or help forum what you think. We received great and helpful feedback after the English version launched this past March, so we hope this straightforward and easy-to-read guide can be helpful (and fun!) to everyone.

Let’s get great sites and searchable content up and running around the world.

Monday, August 25, 2014

#NoHacked: a global campaign to spread hacking awareness

Webmaster level: All

This June, we introduced a weeklong social campaign called #NoHacked. The goals for #NoHacked are to bring awareness to hacking attacks and offer tips on how to keep your sites safe from hackers.

We held the campaign in 11 languages on multiple channels including Google+, Twitter and Weibo. About 1 million people viewed our tips and hundreds of users used the hashtag #NoHacked to spread awareness and to share their own tips. Check them out below!

Posts we shared during the campaign:


Some of the many tips shared by users across the globe:
  • Pablo Silvio Esquivel from Brazil recommends users not to use pirated software (source)
  • Rens Blom from the Netherlands suggests using different passwords for your accounts, changing them regularly, and using an extra layer of security such as two-step authentication (source)
  • Дмитрий Комягин from Russia says to regularly monitor traffic sources, search queries and landing pages, and to look out for spikes in traffic (source)
  • 工務店コンサルタント from Japan advises everyone to choose a good hosting company that's knowledgeable in hacking issues and to set email forwarding in Webmaster Tools (source)
  • Kamil Guzdek from Poland advocates changing the default table prefix in wp-config to a custom one when installing a new WordPress site, to lower the risk of the database being hacked (source)

Hacking is still a surprisingly common issue around the world so we highly encourage all webmasters to follow these useful tips. Feel free to continue using the hashtag #NoHacked to share your own tips or experiences around hacking prevention and awareness. Thanks for supporting the #NoHacked campaign!

And in the unfortunate event that your site gets hacked, we’ll help you toward a speedy and thorough recovery:

Wednesday, August 6, 2014

HTTPS as a ranking signal

Webmaster level: all

Security is a top priority for Google. We invest a lot in making sure that our services use industry-leading security, like strong HTTPS encryption by default. That means that people using Search, Gmail and Google Drive, for example, automatically have a secure connection to Google.

Beyond our own stuff, we’re also working to make the Internet safer more broadly. A big part of that is making sure that websites people access from Google are secure. For instance, we have created resources to help webmasters prevent and fix security breaches on their sites.

We want to go even further. At Google I/O a few months ago, we called for “HTTPS everywhere” on the web.

We’ve also seen more and more webmasters adopting HTTPS (also known as HTTP over TLS, or Transport Layer Security) on their websites, which is encouraging.

For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.


In the coming weeks, we’ll publish detailed best practices (it's in our help center now) to make TLS adoption easier, and to avoid common mistakes. Here are some basic tips to get started:

  • Decide the kind of certificate you need: single, multi-domain, or wildcard certificate
  • Use 2048-bit key certificates
  • Use relative URLs for resources that reside on the same secure domain
  • Use protocol-relative URLs for all other domains (see the sketch after this list)
  • Check out our Site move article for more guidelines on how to change your website’s address
  • Don’t block your HTTPS site from crawling using robots.txt
  • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
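
To illustrate the two URL recommendations, here's a quick sketch (the file names and the CDN domain are hypothetical):

    <!-- On a page served from https://www.example.com/ -->

    <!-- Relative URL: inherits both scheme and host, so the stylesheet is
         fetched from https://www.example.com/styles/main.css -->
    <link rel="stylesheet" href="/styles/main.css">

    <!-- Protocol-relative URL: inherits the page's scheme (https here),
         so the script is never requested over plain HTTP -->
    <script src="//cdn.example.net/widget.js"></script>

Hard-coding http:// for embedded resources on an HTTPS page would trigger mixed-content warnings in browsers; both URL forms above avoid that.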

If your website is already serving on HTTPS, you can test its security level and configuration with the Qualys Lab tool. If you are concerned about TLS and your site’s performance, have a look at Is TLS fast yet?. And of course, if you have any questions or concerns, please feel free to post in our Webmaster Help Forums.

We hope to see more websites using HTTPS in the future. Let’s all make the web more secure!

Monday, August 4, 2014

Introducing the Google News Publisher Center

(Cross-posted on the Google News Blog)

Webmaster level: All

UPDATE: Great News -- The Publisher Center is now available in all countries where Google News has an edition.

If you're a news publisher, your website has probably evolved and changed over time -- just like your stories. But in the past, when you made changes to the structure of your site, we might not have discovered your new content. That meant a lost opportunity for your readers, and for you. Unless you regularly checked Webmaster Tools, you might not even have realized that your new content wasn’t showing up in Google News. To prevent this from happening, we are letting you make changes to our record of your news site using the just-launched Google News Publisher Center.

With the Publisher Center, your potential readers can be more informed about the articles they’re clicking on and you benefit from better discovery and classification of your news content. After verifying ownership of your site using Google Webmaster Tools, you can use the Publisher Center to directly make the following changes:

  • Update your news site details, including changing your site name and labeling your publication with any relevant source labels (e.g., “Blog”, “Satire” or “Opinion”)
  • Update your section URLs when you change your site structure (e.g., when you add a new section such as http://example.com/2014commonwealthgames or http://example.com/elections2014)
  • Label your sections with a specific topic (e.g., “Technology” or “Politics”)

Whenever you make changes to your site, we’d recommend also checking our record of it in the Publisher Center and updating it if necessary.

Try it out, or learn more about how to get started.

At the moment the tool is only available to publishers in the U.S. but we plan to introduce it in other countries soon and add more features.  In the meantime, we’d love to hear from you about what works well and what doesn’t. Ultimately, our goal is to make this a platform where news publishers and Google News can work together to provide readers with the best, most diverse news on the web.

Tuesday, May 27, 2014

Rendering pages with Fetch as Google

Webmaster level: all

The Fetch as Google feature in Webmaster Tools provides webmasters with the results of Googlebot attempting to fetch their pages. The server headers and HTML shown are useful to diagnose technical problems and hacking side-effects, but sometimes make double-checking the response hard: Help! What do all of these codes mean? Is this really the same page as I see it in my browser? Where shall we have lunch? We can't help with that last one, but for the rest, we've recently expanded this tool to also show how Googlebot would be able to render the page.

Viewing the rendered page

In order to render the page, Googlebot will try to find all the external files involved, and fetch them as well. Those files frequently include images, CSS and JavaScript files, as well as other files that might be indirectly embedded through the CSS or JavaScript. These are then used to render a preview image that shows Googlebot's view of the page.

You can find the Fetch as Google feature in the Crawl section of Google Webmaster Tools. After submitting a URL with "Fetch and render," wait for it to be processed (this might take a moment for some pages). Once it's ready, just click on the response row to see the results.

Handling resources blocked by robots.txt

Googlebot follows the robots.txt directives for all files that it fetches. If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that's disallowing Googlebot's crawling of them), we won't be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won't be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we'll show them below the preview image.

We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site's visible content, or to its layout. That will make Fetch as Google easier for you to use, and will make it possible for Googlebot to find and index that content as well. Some types of content – such as social media buttons, fonts or website-analytics scripts – tend not to meaningfully contribute to the visible content or layout, and can be left disallowed from crawling. For more information, please see our previous blog post on how Google is working to understand the web better.
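
For example, a minimal robots.txt sketch along these lines (the paths are hypothetical) keeps rendering-critical resources crawlable while leaving a non-essential script blocked; for Googlebot, the most specific (longest) matching rule wins:

    User-agent: Googlebot
    # CSS and JavaScript that affect the rendered page stay crawlable
    Allow: /assets/css/
    Allow: /assets/js/
    # Everything else under /assets/, such as an analytics script, stays blocked
    Disallow: /assets/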

We hope this update makes it easier for you to diagnose these kinds of issues, and to discover content that's accidentally blocked from crawling. If you have any comments or questions, let us know here or drop by in the webmaster help forum.

Monday, May 19, 2014

Making your site more mobile-friendly with PageSpeed Insights

Webmaster level: all


To help developers and webmasters make their pages mobile-friendly, we recently updated PageSpeed Insights with additional recommendations on mobile usability.




Poor usability can diminish the benefits of a fast page load. We know the average mobile page takes more than 7 seconds to load, and by using the PageSpeed Insights tool and following its speed recommendations, you can make your page load much faster. But suppose your fast mobile site loads in just 2 seconds instead of 7 seconds. If mobile users still have to spend another 5 seconds once the page loads to pinch-zoom and scroll the screen before they can start reading the text and interacting with the page, then that site isn’t really fast to use after all. PageSpeed Insights’ new User Experience rules can help you find and fix these usability issues.

These new recommendations currently cover the following areas:
  • Configure the viewport: Without a meta-viewport tag, modern mobile browsers will assume your page is not mobile-friendly, and will fall back to a desktop viewport and possibly apply font-boosting, interfering with your intended page layout. Configuring the viewport to width=device-width should be your first step in mobilizing your site (see the sketch after this list).

  • Size content to the viewport: Users expect mobile sites to scroll vertically, not horizontally. Once you’ve configured your viewport, make sure your page content fits the width of that viewport, keeping in mind that not all mobile devices are the same width.

  • Use legible font sizes: If users have to zoom in just to be able to read your article text on their smartphone screen, then your site isn’t mobile-friendly. PageSpeed Insights checks that your site’s text is large enough for most users to read comfortably.
  • Size tap targets appropriately: Nothing’s more frustrating than trying to tap a button or link on a phone or tablet touchscreen, and accidentally hitting the wrong one because your finger pad is much bigger than a desktop mouse cursor. Make sure that your mobile site’s touchscreen tap targets are large enough to press easily.
  • Avoid plugins: Most smartphones don’t support Flash or other browser plugins, so make sure your mobile site doesn't rely on plugins.
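
As promised above, here is a minimal page skeleton with the viewport configured (everything else omitted):

    <!DOCTYPE html>
    <html>
      <head>
        <!-- Match the layout viewport to the device width instead of the
             roughly 980px desktop fallback -->
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <title>Example</title>
      </head>
      <body>
        <!-- page content -->
      </body>
    </html>
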
These rules are described in more detail in our help pages. When you’re ready, you can test your pages and the improvements you make using the PageSpeed Insights tool. We’ve also updated PageSpeed Insights to use a mobile-friendly design, and we’ve translated our documents into additional languages.

As always, if you have any questions or feedback, please post in our discussion group.

Wednesday, April 30, 2014

Webmaster Guidelines for sneaky redirects updated

Webmaster Level: All

Redirects are often used by webmasters to help forward visitors from one page to another. They are a normal part of how the web operates, and are very valuable when well used. However, some redirects are designed to manipulate or deceive search engines or to display different content to human users than to search engines. Our quality guidelines strictly forbid these kinds of redirects.

For example, desktop users might receive a normal page, while hackers might redirect all mobile users to a completely different spam domain. To help webmasters better recognize problematic redirects, we have updated our quality guidelines for sneaky redirects with examples that illustrate redirect-related violations.

We have also updated the hacked content guidelines to include redirects on compromised websites. If you believe your site has been compromised, follow these instructions to identify the issues on your site and fix them.

As with any violation of our quality guidelines, we may take manual action, including removal from our index, in order to maintain the quality of the search results. If you have any questions about our guidelines, feel free to ask in our Webmaster Help Forum.


Tuesday, April 22, 2014

Introducing our global Google+ page for webmasters

Webmaster Level: All

We’ve recently launched our global Google Webmasters Google+ page. Have you checked it out yet? Our page covers a plethora of topics:
Follow us at google.com/+GoogleWebmasters and let us know in the comments what else you’d like to see on our page! If you speak Italian, Japanese, Russian or Spanish, be sure to also join one of our webmaster communities to stay up-to-date on language and region-specific news.

Tuesday, March 18, 2014

Introducing the new Webmaster Academy

Webmaster level: Beginner

Our Webmaster Academy is now available with new and targeted content!

Two years ago, Webmaster Academy launched to teach new and beginner webmasters how to make great websites. In addition to adding new content, we've now expanded and improved information on three important topics:
  • Making a great site that’s valuable to your audience (Module 1)
  • Learning how Google sees and understands your site (Module 2)
  • Communicating with Google about your site (Module 3)
If you often find yourself overwhelmed by the depth or breadth of our resources, Webmaster Academy will help you understand the basics of creating a website and having it found in Google Search. If you’re an experienced webmaster, you might learn something new too.

Enjoy, learn, and share your feedback!

Wednesday, March 12, 2014

Musical artists: your official tour dates in the Knowledge Graph

Webmaster level: all

When music lovers search for their favorite band on Google, we often show them a Knowledge Graph panel with lots of information about the band, including the band’s upcoming concert schedule. It’s important to fans and artists alike that this schedule be accurate and complete. That’s why we’re trying a new approach to concert listings. In our new approach, all concert information for an artist comes directly from that artist’s official website when they add structured data markup.

If you’re the webmaster for a musical artist’s official website, you have several choices for how to participate:

  1. You can implement schema.org markup on your site. That’s easier than ever, since we’re supporting the new JSON-LD format (alongside RDFa and microdata) for this feature.
  2. Even easier, you can install an events widget that has structured data markup built in, such as Bandsintown, BandPage, ReverbNation, Songkick, or GigPress.
  3. You can label the site’s events with your mouse using Google’s point-and-click webmaster tool: Data Highlighter.

All these options are explained in detail in our Help Center. If you have any questions, feel free to ask in our Webmaster Help forums. So don’t you worry ’bout a schema.org/Thing ... just mark up your site’s events and let the good schema.org/Times roll!
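
If you take the markup route, a minimal JSON-LD sketch for a single concert might look like this (the band, venue, and date are hypothetical placeholders; see schema.org/MusicEvent for the full vocabulary):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "MusicEvent",
      "name": "Example Band at Example Arena",
      "startDate": "2014-05-30T19:30",
      "location": {
        "@type": "Place",
        "name": "Example Arena",
        "address": "123 Main Street, Springfield"
      },
      "performer": {
        "@type": "MusicGroup",
        "name": "Example Band"
      }
    }
    </script>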


Thursday, February 27, 2014

3 tips to find hacking on your site, and ways to prevent and fix it



Google shows this message in search results for sites that we believe may have been compromised.

You might not think your site is a target for hackers, but it's surprisingly common. Hackers target large numbers of sites all over the web in order to exploit the sites' users or reputation.

One common way hackers take advantage of vulnerable sites is by adding spammy pages. These spammy pages are then used for various purposes, such as redirecting users to undesired or harmful destinations. For example, we’ve recently seen an increase in hacked sites redirecting users to fake online shopping sites.

Once you recognize that your website may have been hacked, it’s important to diagnose and fix the problem as soon as possible. We want webmasters to keep their sites secure in order to protect users from spammy or harmful content.

3 tips to help you find hacked content on your site

  1. Check your site for suspicious URLs or directories
    Keep an eye out for any suspicious activity on your site by performing a “site:” search of your site in Google, such as [site:example.com]. Are there any suspicious URLs or directories that you do not recognize?

    You can also set up a Google Alert for your site. For example, if you set a Google Alert for [site:example.com (viagra|cialis|casino|payday loans)], you’ll receive an email when these keywords are detected on your site.

  2. Look for unnatural queries on the Search Queries page in Webmaster Tools
    The Search Queries page shows Google Web Search queries that have returned URLs from your site. Look for unexpected queries, as they can be an indication of hacked content on your site.

    Don’t be quick to dismiss queries in different languages. This may be the result of spammy pages in other languages placed on your website.


    Example of an English site hacked with Japanese content.
  3. Enable email forwarding in Webmaster Tools
    Google will send you a message if we detect that your site may be compromised. Messages appear in Webmaster Tools’ Message Center but it's a best practice to also forward these messages to your email. Keep in mind that Google won’t be able to detect all kinds of hacked content, but we hope our notifications will help you catch things you may have missed.

Tips to fix and prevent hacking

  • Stay informed
    The Security Issues section in Webmaster Tools will show you hacked pages that we detected on your site. We also provide detailed information to help you fix your hacked site. Make sure to read through this documentation so you can quickly and effectively fix your site.

  • Protect your site from potential attacks
    It's better to prevent sites from being hacked than to clean up hacked content. Hackers will often take advantage of security vulnerabilities on commonly used website management software. Here are some tips to keep your site safe from hackers:

    • Always keep the software that runs your website up-to-date.
    • If your website management software tools offer security announcements, sign up to get the latest updates.
    • If the software for your website is managed by your hosting provider, try to choose a provider that you can trust to maintain the security of your site.

We hope this post makes it easier for you to identify, fix, and prevent hacked spam on your site. If you have any questions, feel free to post in the comments, or drop by the Google Webmaster Help Forum.

If you find suspicious sites in Google search results, please report them using the Spam Report tool.

Monday, January 27, 2014

Affiliate programs and added value

Webmaster level: All

Our quality guidelines warn against running a site with thin or scraped content without adding substantial added value to the user. Recently, we’ve seen this behavior on many video sites, particularly in the adult industry, but also elsewhere. These sites display content provided by an affiliate program—the same content that is available across hundreds or even thousands of other sites.

If your site syndicates content that’s available elsewhere, a good question to ask is: “Does this site provide significant added benefits that would make a user want to visit this site in search results instead of the original source of the content?” If the answer is “No,” the site may frustrate searchers and violate our quality guidelines. As with any violation of our quality guidelines, we may take action, including removal from our index, in order to maintain the quality of our users’ search results. If you have any questions about our guidelines, you can ask them in our Webmaster Help Forum.

Tuesday, January 7, 2014

Improved Search Queries stats for separate mobile sites

Webmaster Level: All

Search Queries in Webmaster Tools just became more cohesive for those who manage a mobile site on a separate URL from desktop, such as mobile on m.example.com and desktop on www. In Search Queries, when you view your m. site* and set Filters to “Mobile,” from Dec 31, 2013 onwards, you’ll now see:
  • Queries where your m. pages appeared in search results for mobile browsers
  • Queries where Google applied Skip Redirect. This means that, while search results displayed the desktop URL, the user was automatically directed to the corresponding m. version of the URL (thus saving the user from the latency of a server-side redirect).

Skip Redirect information (impressions, clicks, etc.) calculated with mobile site.

Prior to this Search Queries improvement, Webmaster Tools reported Skip Redirect impressions with the desktop URL. Now we’ve consolidated information when Skip Redirect is triggered, so that impressions, clicks, and CTR are calculated solely with the verified m. site, making your mobile statistics more understandable.

Best practices if you have a separate m. site

Here are a few search-friendly recommendations for those publishing content on a separate m. site:
  • Follow our advice on Building Smartphone-Optimized Websites
    • On the desktop page, add a special link rel="alternate" tag pointing to the corresponding mobile URL. This helps Googlebot discover the location of your site's mobile pages (see the sketch after this list).
    • On the mobile page, add a link rel="canonical" tag pointing to the corresponding desktop URL.
    • Use the HTTP Vary: User-Agent header if your servers automatically redirect users based on their user agent/device.
  • Verify ownership of both the desktop (www) and mobile (m.) sites in Webmaster Tools for improved communication and troubleshooting information specific to each site.
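
Here's a sketch of the alternate/canonical annotations with hypothetical URLs (the media value shown is the example commonly used for smartphones in our smartphone-optimization guidelines):

    <!-- On the desktop page, http://www.example.com/page-1 -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://m.example.com/page-1">

    <!-- On the corresponding mobile page, http://m.example.com/page-1 -->
    <link rel="canonical" href="http://www.example.com/page-1">

The pair tells Google that the two URLs represent one logical page, so signals are consolidated on the desktop URL while mobile searchers can be sent to the m. version.
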
* Be sure you've verified ownership for your mobile site!

Tuesday, December 24, 2013

So long, 2013, and thanks for all the fish

Now that 2013 is almost over, we'd love to take a quick look back, and venture a glimpse into the future. Some of the important topics on our blog from 2013 were around mobile, internationalization, and search quality in general. Here are some of the most popular new posts from this year:

It's been a busy year here on the blog. We hope that our posts here have helped to make these sometimes complex topics a bit easier to understand. Is there anything you would have wanted more information about? Let us know in the comments!

Our Help Forum and office hours hangouts have also been a place for helpful, insightful, and sometimes controversial discussions. It's not always easy to find ways to improve websites, or to solve technical & usability issues that users post about, so we're extremely thankful to have such a fantastic group of Top Contributors that give advice and provide feedback there.



Where are we headed in 2014? Only time will tell, but I'm sure we'll see more information for the general webmaster, hard-core technical advice, ways to make mobile sites even better, rockin' Webmaster Tools updates, tips on securing your site & its connections, and more. Are you ready? Don't forget your towel & let's go!


On behalf of all the webmaster help forum guides, we wish you happy holidays & a great 2014.


Wednesday, December 18, 2013

Improving URL removals on third-party sites

Webmaster level: all

Content on the Internet changes or disappears, and occasionally it's helpful to have search results for it updated quickly. Today we launched our improved public URL removal tool to make it easier to request updates based on changes on other people's websites. You can find it at https://www.google.com/webmasters/tools/removals.


This tool is useful for removals on other people's websites. You could use this tool if a page has been removed completely, or if it was just changed and you need to have the snippet & cached page removed. If you're the webmaster of the site, then using the Webmaster Tools URL removal feature is faster & easier.

How to request a page be removed from search results

If the page itself was removed completely, you can request that it be removed from Google's search results. For this, it's important that the page returns the proper HTTP result code (403, 404, or 410), has a noindex robots meta tag (a minimal example follows the steps below), or is blocked by the robots.txt (blocking via robots.txt may not prevent indexing of the URL permanently). You can check the HTTP result code with an HTTP header checker. While we attempt to recognize "soft-404" errors, having the website use a clear response code is always preferred. Here's how to submit a page for removal:
  1. Enter the URL of the page. As before, this needs to be the exact URL as indexed in our search results. Here's how to find the URL.
  2. The analysis tool will confirm that the page is gone. Confirm the request to complete the submission.
  3. There's no step three!
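
As mentioned above, if the page still exists but should be dropped from results, the noindex robots meta tag is a single tag in the page's <head>; a minimal sketch:

    <!-- In the <head> of the page that should stay out of search results -->
    <meta name="robots" content="noindex">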

How to request a page's cache & snippet be removed from search results

If the page wasn't removed, you can also use this tool to let us know that text on a page (such as a name) has been removed or changed. It'll remove the snippet & cached page in Google's search results until our systems have been able to reprocess the page completely (it won't affect title or ranking). In addition to the page's URL, you'll need at least one word that used to be on the page but is now removed. You can learn more about cache removals in our Help Center.
  1. Enter the URL of the page which has changed. This needs to be the exact URL as indexed in our search results. Here's how to find the URL.
  2. Confirm that the page has been updated or removed, and confirm that the cache & snippet are outdated (do not match the current content).
  3. Now, enter a word that no longer appears on the live page, but which is still visible in the cache or snippet. See our previous blog post on removals for more details.

You can find out more about URL removals in our Help Center, as well as in our earlier blog posts on removing URLs & directories, removing & updating cached content, removing content you don't own, and tracking requests + what not to remove.

We hope these changes make it easier for you to submit removal requests! We welcome your feedback in our removals help forum category, where other users may also be able to help with more complicated removal issues.