
Tuesday, 23 August 2016

Google warns it will crack down on “intrusive interstitials” in January

Google has announced that it will begin cracking down on “intrusive interstitials” on January 10, 2017, because this type of ad “can be problematic on mobile devices where screens are often smaller.”
Google will potentially be penalizing these web pages — i.e., lowering their rankings. Google said “pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.”
Google explained which types of interstitials will be considered problematic.

Saturday, 6 August 2016

Google app now delivering restaurant reviews & best-of lists for local searches

Starting today in the U.S., Google says food- and drink-related searches will now return reviews from top critics and include best-of lists.


Tuesday, 26 July 2016

Is your HTTPS setup causing search engine optimization troubles?

Although Google has said that sites using HTTPS may be given a minor rankings boost, many websites have experienced losses because of improper implementation. Columnist Tony Edward discusses common issues and how to fix them.
Google has been making the push for sites to move to HTTPS, and many folks have already started to include this in their SEO strategy. Recently at SMX Advanced, Gary Illyes from Google said that 34 percent of the Google search results are HTTPS. That’s more than I personally expected, but it’s a good sign, as more sites are becoming secured.
However, more and more, I’m noticing a lot of sites have migrated to HTTPS but have not done it correctly and may be losing out on the HTTPS ranking boost. Some have also created more problems on their sites by not migrating correctly.

HTTPS post-migration issues

One of the common issues I’ve noticed after a site has migrated to HTTPS is that the HTTPS version is not set as the preferred one, and the HTTP version is still floating around. Back in December 2015, Google said that in scenarios like this, it would index the HTTPS version by default.
However, having two live site versions still causes the following problems:
  • Duplicate content
  • Link dilution
  • Waste of search engine crawl budget

Duplicate content

If canonical tags are not leveraged, Google sees two site versions live, which is considered duplicate content. For example, the following site has both HTTPS and HTTP versions live and is not leveraging canonical tags.
HTTP site version indexed
HTTPS site version indexed
Because of this incorrect setup, we see both HTTP and HTTPS site versions are indexed.
HTTPS & HTTP Site Versions Indexed
I’ve also seen sites that have canonical tags in place, but the setup is incorrect. For example, Adorama.com has both HTTP and HTTPS versions live — and both versions self-canonicalize. This does not eliminate the duplicate content issue.
HTTP Canonical
HTTPS Canonical
Adorama’s XML sitemap highlights the HTTP URLs instead of the HTTPS versions.
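One way to catch the self-canonicalization problem described above is to check each page’s canonical tag programmatically. Here is a minimal sketch (not from the article) that extracts the canonical URL from a page’s HTML and verifies it points at the HTTPS version; the sample pages are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of any <link rel="canonical"> tag on the page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def canonical_points_to_https(html):
    """True only if the page declares an https:// canonical URL."""
    finder = CanonicalFinder()
    finder.feed(html)
    return bool(finder.canonical) and finder.canonical.startswith("https://")

# A page that self-canonicalizes to its HTTP URL keeps the duplication alive:
http_page = '<html><head><link rel="canonical" href="http://example.com/p"></head></html>'
https_page = '<html><head><link rel="canonical" href="https://example.com/p"></head></html>'
```

Run against both live versions of a URL, a check like this would flag the Adorama-style setup, where each version self-canonicalizes instead of both pointing to HTTPS.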

Link dilution

Having both the HTTPS and HTTP versions live, even with canonical tags in place, can cause link dilution. What will happen is that different users will come across both site versions, sharing and linking to them respectively. So social signals and external link equity can get split into two URLs instead of one.

Waste of search engine crawl budget

If canonical tags are not leveraged, and both versions are live, the search engines will end up crawling both, which will waste crawl budget. Instead of crawling just one preferred version, the search engines have to do double work. This can be problematic for very large sites.
The ideal setup to address the issues above is to have the HTTP version URLs 301 redirect to the HTTPS versions sitewide. This will eliminate the duplication, link dilution and waste of crawl budget. Here is an example:
HTTP 301 redirect to HTTPS
Be sure not to use 302 redirects, which are temporary redirects. Here is an example of a site getting this wrong: it is 302 redirecting the HTTPS version to the HTTP version, when instead the HTTP version should 301 redirect to HTTPS.
HTTPS 302 redirect
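The two failure modes above (wrong status code, wrong direction) can be checked mechanically. Below is a small sketch, not from the article, that classifies a single redirect hop given its status code and the two URLs; in practice you might obtain these from something like `requests.head(url, allow_redirects=False)` and the `Location` header:

```python
def check_https_redirect(status, from_url, to_url):
    """Classify one redirect hop between HTTP and HTTPS versions of a URL.

    Returns "ok" for a permanent HTTP -> HTTPS redirect, or a short
    description of what is wrong otherwise.
    """
    # Wrong direction: secure version bouncing visitors back to HTTP.
    if from_url.startswith("https://") and to_url.startswith("http://"):
        return "wrong direction: HTTPS is redirecting to HTTP"
    # Temporary redirect: search engines may not transfer signals.
    if status == 302:
        return "temporary redirect: use a 301 instead"
    if status == 301 and from_url.startswith("http://") and to_url.startswith("https://"):
        return "ok"
    return "unexpected redirect"
```

The example URLs and the exact return strings are illustrative; the point is that both the status code and the direction of the hop need to be verified.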
Here is a list of the best practices for a correct HTTPS setup to avoid SEO issues:
  1. Ensure your HTTPS site version is added in Google Search Console and Bing Webmaster Tools. In Google Search Console, add both the www and non-www versions. Set your preferred domain under the HTTPS versions.
  2. 301 redirect HTTP URL versions to their HTTPS versions sitewide.
  3. Ensure all internal links point to the HTTPS version URLs sitewide.
  4. Ensure canonical tags point to the HTTPS URL versions.
  5. Ensure your XML Sitemap includes the HTTPS URL versions.
  6. Ensure all external links to your site that are under your control, such as social profiles, point to the HTTPS URL versions.
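Items 3 through 5 on the checklist above lend themselves to automation. As a rough sketch (mine, not the columnist’s), the following collects every `href`/`src` on a page and reports internal links that still use `http://`; the `SITE_HOST` domain and sample page are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

SITE_HOST = "example.com"  # hypothetical domain being audited

class LinkCollector(HTMLParser):
    """Collects every href and src value found on the page."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.urls.append(value)

def insecure_internal_links(html):
    """Return internal links that still point at http:// (checklist item 3)."""
    collector = LinkCollector()
    collector.feed(html)
    return [u for u in collector.urls
            if urlparse(u).scheme == "http" and urlparse(u).hostname == SITE_HOST]

page = ('<a href="http://example.com/shop">Shop</a>'
        '<a href="https://example.com/blog">Blog</a>'
        '<a href="http://other.com/">Partner</a>')
```

The same collector could be pointed at your XML sitemap to verify item 5, since sitemap `<loc>` entries are just URLs.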

Thursday, 21 July 2016

Using spark files for content creation

I’ve only recently started calling my mess of notes by this name, but for as long as I can remember, I’ve used these types of notes in some form or another. I used to just have a notebook, and then I started using emails to record info, sending one to myself every night and continuing to add to it and reply to it.

That was terribly inefficient and awkward, so I started using the Notes on my iPhone. Then I discovered Evernote — and now that’s my go-to tool as it syncs up across devices, and I can read my notes anywhere.

Since I do more than just link development, I love having an overall spark file for clients where I jot down anything interesting in any way. Sure, it gets pretty big, and I routinely have to go through and edit it, but it’s great to have a place where I can jot down ideas as they come to me.

Sometimes, an idea I have for a link campaign might turn into something I can put into practice for a paid ad on Facebook or Google AdWords. I might find something crazy during a site audit that I want to note as “something to check first!” with my next audit. If a webmaster responds to a link request with information that I hadn’t considered when I did the outreach, I usually note this so I won’t make the same mistake again — those points can be useful in other areas, too.

Sunday, 10 July 2016

What’s new with markup & structured data

Contributor Eric Enge recaps a session from SMX Advanced on structured data markup in its many forms.
Structured data makes certain types of web content highly accessible and understandable to search engines and other third-party applications. Because the data on the page is tagged with standardized identifying code, it’s far easier to process and interpret than a regular webpage.

For that reason, people refer to this type of data as “linked data” (similar to the way that the World Wide Web links billions of documents together).

At June’s SMX Advanced, Aaron Bradley did a brilliant job in his presentation, “What’s New With Structured Data Markup?,” providing a detailed update on what’s happening in this area.

If you’re interested in a really detailed timeline of all the major happenings in the world of structured data markup, you can get access to that here.
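The most common way to publish structured data today is schema.org vocabulary serialized as JSON-LD in a `<script type="application/ld+json">` tag. As a minimal illustration (my sketch, with made-up values, not an example from Bradley’s talk), here is how an Article snippet can be assembled and serialized:

```python
import json

def article_jsonld(headline, author, date_published):
    """Build a minimal schema.org Article description as a JSON-LD dict."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,  # ISO 8601 date
    }

snippet = article_jsonld("What's New With Structured Data Markup?",
                         "Aaron Bradley", "2016-06-22")
# Embed in the page inside <script type="application/ld+json"> ... </script>
script_body = json.dumps(snippet, indent=2)
```

Because the identifying code is standardized (`@type`, `headline`, and so on), any consumer that understands schema.org can interpret the page without scraping its visible HTML.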

Saturday, 9 July 2016

Google Search Console bug sends re-verification notifications to users

Google had a bug with its Search Console platform this morning, in which it sent out re-verification email notifications to many users.

The problem was that the verification method became invalid for a large number of websites in Google Search Console, and Google then had to re-verify those sites. During that process, it resent verification notifications to those users.

The email notification read, “Google has identified that [email] has been added as an owner of [domain name].” If you got one of these, Google’s John Mueller said on Twitter not to panic — just check to make sure that everyone who is verified on your site should be.

This is probably a good thing for a webmaster to do on a periodic basis anyway. To do so, log in to your Search Console, click on the gears icon, and then click the verification details screen. Make sure the email addresses listed there are users you want to have access to your account.

Here’s a screenshot showing the verification attempts and then the re-verification, followed by those who have access to the account in Search Console.

Saturday, 2 July 2016


Imagine this nightmare scenario: you’re on the verge of launching your newly redesigned website, and you’re already anticipating new leads and returning customers. You’ve spent countless hours working through every last detail before even thinking about unveiling your new creation to the world. The big day arrives, and you give the green light to launch.

All of a sudden, you realize you forgot to plan for one crucial detail: the SEO best practices that you had so carefully integrated into your old website.

Unfortunately, this isn’t just a nightmare that can be forgotten once you’ve had your morning coffee, but something I’ve seen happen to countless small businesses over my 10 years as the owner of an SEO and online marketing agency.

Your redesigned website was meant to give your business a new lease on life, but instead, you’ve destroyed your organic search rankings and traffic overnight. When you change your website without thoroughly thinking through the SEO implications, you might do something dangerous like throw away great content or change every page’s URL without making sure to redirect the old ones.

Fortunately, you can easily avoid this frightening scenario altogether by planning ahead and learning from the mistakes illustrated in the examples below.

Mistake #1: You added Flash-based or unoptimized images
So, you added a number of large and captivating images to your new landing pages in the hopes of making your site visually attractive. Or perhaps you moved to a more visual, but less SEO-friendly, Flash design.

Don’t make the mistake of forgetting to optimize the new images, or the pages on your new site may load so slowly that potential customers exit before viewing any of the content. Relying on Flash elements can also cause big problems for SEO and will prevent many mobile users from viewing the page at all.

In the land of online commerce, patience is not a virtue — today’s savvy customers are ever more impatient waiting for pages to load. According to Radware, customers will abandon a page within three seconds if it hasn’t loaded.

Consider an example from the publishing world. The Financial Times, while working on a new version of its site, ran an experiment to understand how speed impacted user engagement — specifically, the number of articles read by visitors, which is one of the primary ways they measure their success. They then used this data to calculate the effect on their revenue.

What they found was that the speed of their site significantly affected their revenue streams, from many hundreds of thousands of dollars in the short term to millions in the long term.
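A simple way to keep image weight in check before launch is to audit every image on a page against a byte budget. The sketch below is illustrative only (the budget, URLs, and sizes are made up); in practice the sizes might come from `Content-Length` headers returned by HEAD requests:

```python
IMAGE_BYTE_BUDGET = 200 * 1024  # illustrative 200 KB per-image budget

def oversized_images(image_sizes, budget=IMAGE_BYTE_BUDGET):
    """Return (url, bytes) pairs for images over budget, largest first.

    image_sizes maps image URL -> size in bytes.
    """
    return sorted(((u, s) for u, s in image_sizes.items() if s > budget),
                  key=lambda pair: pair[1], reverse=True)

sizes = {
    "https://example.com/hero.jpg": 1_400_000,  # unoptimized hero image
    "https://example.com/logo.png": 18_000,
}
```

Anything the audit flags can then be recompressed or resized before the redesign ships, rather than discovered through slow pages after launch.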

Mistake #2: you left out emigrate important content
Maximum people understand that a successful internet site includes informative and particular content material on every web page, that's particularly targeted to your target market. This consists of the at the back of-the-scenes content material, too, inclusive of descriptive alt text on pictures and meta facts that adds clear information.

Even though a new web site provides a super possibility to replace weak content material, it’s additionally important to transfer over content material this is already tied for your strong natural seek site visitors.

On this real-existence catastrophe story of content material migration long past incorrect, an unbiased software program vendor changed into desperately in want of a new web site layout and was hoping to update its technical content material to create a higher experience for the average user. Even though they attempted to assume thru a way to maintain their existing search engine optimization practices, they proceeded with the update even as making one foremost omission.

At some point of the migration, a area in their cms that mechanically populates as a meta description was became off and as a end result, each unmarried product web page on their new website online turned into lacking a meta description.
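An omission like that is easy to catch with a pre-launch sweep over the staging site. Here is a minimal sketch (not the vendor’s actual tooling) that checks a page’s HTML for a non-empty meta description; the sample pages are hypothetical:

```python
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Records the content of a <meta name="description"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = (a.get("content") or "").strip()

def missing_meta_description(html):
    """True if the page has no meta description, or only an empty one."""
    finder = MetaDescriptionFinder()
    finder.feed(html)
    return not finder.description

good = '<head><meta name="description" content="Product specs and pricing."></head>'
bad = '<head><title>Product</title></head>'
```

Run across every product page on staging, a check like this would have surfaced the disabled CMS field before the redesign went live.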

Mistake #3: You blocked search engines from crawling your site
When a website launch goes bad, the main failure is usually due to mistakes made in the early planning stages. Not allowing search engines to crawl your new site is a common mistake that often happens when sites are moved from the staging location to the live server.

Perhaps you used robots.txt to block search engine crawlers while the site was in development, but forgot to update the file when the site went live. Or maybe you accidentally put firewalls in place that are blocking site crawlers.

In this example, a webmaster used a WordPress plugin called Wordfence to prevent bots from crawling the site, in an attempt to reduce server load and fake referrals. This plugin allows you to whitelist certain bots, allowing them to crawl the site. He whitelisted several known Googlebot IPs, but unfortunately Google switched the IPs it was crawling from, causing the new IPs to get blocked.

While those new IPs were only blocked for three or four days, it caused the site’s web traffic to halt. When the mistake was discovered, and traffic started picking up again, it remained slow.
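The robots.txt variant of this mistake is cheap to test for at launch. Python’s standard library ships a robots.txt parser, so a sketch like the following (my example, with made-up rule sets) can confirm Googlebot is actually allowed in before you flip the switch:

```python
import urllib.robotparser

def googlebot_blocked(robots_txt, url="/"):
    """True if the given robots.txt rules block Googlebot from the URL."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)

staging_rules = "User-agent: *\nDisallow: /"  # common block-all staging file
live_rules = "User-agent: *\nDisallow:"       # empty Disallow allows everything
```

Note that this only covers robots.txt; firewall or plugin-level blocking, as in the Wordfence story above, has to be tested by actually fetching pages from outside your own network.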

Mistake #4: You didn’t prioritize mobile-friendliness
Last year, the infamous “Mobilegeddon” update took the SEO world by storm when Google kicked its mobile-friendly algorithm into gear.

Although the impact seemed minimal at first, a later report by Adobe found that the new algorithm had as large an effect as feared. Adobe monitored traffic to over 5,000 websites and then split the results into mobile-friendly versus non-mobile-friendly. The report found that traffic to mobile-unfriendly sites from Google mobile searches declined 12 percent in the first months after the update.

Additionally, according to a study from Moovweb, when a site is not mobile-friendly, there are obvious visibility, ranking and usability consequences. If you focused mostly on how your new site would look on a desktop, you may have inadvertently caused yourself more harm than good.

Today’s online shoppers are heavy mobile users who will be quick to bounce if your site can’t be read properly on a mobile device. The big algorithm changes caused multitudes of webmasters to change their sites so that they would still be visible in Google’s organic search results. To date, smaller businesses have taken the biggest hit from Mobilegeddon as they struggle to adapt to the mobile changes.