SEO and APIs: Helping or hurting?

@Gail5422790

Posted in: #Api #Google #Seo

I have a couple of real estate sites for some clients, and I was wondering about APIs and how they would affect SEO.

Particularly, I'm looking at using the Yelp API for nearby stuff, and possibly a Google (or other) news feed for the particular neighborhoods. Since this information is essentially "duplicated," does this hurt SEO? Or would the increase in content and cross-linking improve it?

In searching for an answer to this question, it seems that just about everyone agrees that duplicate content is a bad thing for SEO (especially scraped content), but does this change anything since it's actually accessed by my websites in a legitimate manner?
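
For concreteness, here is a minimal sketch of the kind of integration I mean, assuming the Yelp Fusion business-search endpoint and an API key in a YELP_API_KEY environment variable (the key handling and coordinates are illustrative, not my actual setup):

```typescript
// Minimal sketch of pulling "nearby stuff" from the Yelp Fusion
// business-search endpoint. Assumes Node 18+ (global fetch) and a
// YELP_API_KEY environment variable -- both illustrative.
interface YelpBusiness {
  name: string;
  url: string; // Yelp's page for the business
  rating: number;
}

async function fetchNearby(
  latitude: number,
  longitude: number,
): Promise<YelpBusiness[]> {
  const params = new URLSearchParams({
    latitude: String(latitude),
    longitude: String(longitude),
    limit: "10",
  });
  const res = await fetch(
    `https://api.yelp.com/v3/businesses/search?${params}`,
    { headers: { Authorization: `Bearer ${process.env.YELP_API_KEY}` } },
  );
  if (!res.ok) throw new Error(`Yelp API responded ${res.status}`);
  const data = await res.json();
  return data.businesses as YelpBusiness[];
}
```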


2 Comments


 

@Murray155

Your question, though a little old, is still relevant, and what started out as a question about APIs turned into one about duplicate content. If you're asking whether using APIs will, by themselves, hurt your SEO: they won't. Depending on the implementation, it could hurt SEO, but it probably still wouldn't be the fault of the API ;)

As for duplicate content, there's a lot of fluff on the interwebs on the topic. It's not so much that it's bad for all SEO as that it's a problem for search engines in their quest to serve the most relevant content. Duplicate content is everywhere and there's no stopping it (search engines openly recognize this), except where it matters most: on your own site. Let me put it like this: a higher-authority site can take content from a lower-authority site and usually outrank it every time. Who would you say got the penalty there? Conversely, if your own site has several pages that are the same, or deemed close enough, you will indeed get a penalty.

Think about it: some of that content from Yelp is likely duplicate content from the business's own site, as surely is other content pulled in from other APIs. Believe it or not, some sites can and do subsist largely on content from other sites, a.k.a. duplicate content (Alltop, DZone, Reddit).

To answer the question: I say use the APIs, but make sure you layer in other content, ideally unique, relevant, and keyword-rich. Good luck!



 

@BetL925

Showing data from another source on your website can cause your site to be penalized for duplicate content. You certainly need original content of your own. You can't build a site with just content from other sources.

Data from other sources that is relevant and useful to your users can be shown alongside your own original content.

When using data from an API, there are some things you can do to mitigate the duplicate-content risk:

Prevent Googlebot from crawling the data by using one of these techniques (see the sketch after this list):

  Put the pages that contain it in robots.txt, or use meta robots noindex tags.
  Put the data in iframes and disallow the iframe URLs in robots.txt.
  Use JavaScript and AJAX to load the data into the page after it loads. Widgets provided by other sites mostly work this way.

Link back to the original source of the data. It should be a deep link to the exact page on the source site that contains the text and data you are using.
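
The first two techniques are a matter of configuration (robots.txt rules and meta tags). As a sketch of the third, this is roughly how client-side loading looks, so the third-party data never appears in the server-rendered HTML; "/api/nearby" is a hypothetical same-origin proxy for the external API, and the anchor's href follows the deep-link advice above:

```typescript
// Sketch of the JavaScript/AJAX technique: the data is fetched in the
// browser after the page loads, so it is not part of the
// server-rendered HTML. "/api/nearby" is a hypothetical same-origin
// proxy that forwards to the third-party API.
interface NearbyPlace {
  name: string;
  url: string; // page on the source site for this exact place
}

async function renderNearby(containerId: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  const res = await fetch("/api/nearby");
  const places: NearbyPlace[] = await res.json();

  for (const place of places) {
    const link = document.createElement("a");
    link.href = place.url; // deep link back to the original source
    link.textContent = place.name;
    container.appendChild(link);
  }
}

document.addEventListener("DOMContentLoaded", () => {
  void renderNearby("nearby-places");
});
```

One caveat: Googlebot has been able to render JavaScript for some time now, so the robots.txt and noindex options are the more reliable of the three.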
