
@Cody1181609

Cody1181609

Last seen: Tue 11 May, 2021


Recent posts

query : Re: Should 404 pages feature in your sitemap.xml I tend to generate the sitemaps for my static sites (no CMS) by using a service such as xml-sitemaps.com. The way this works is you input your

@Cody1181609

I would say yes, but not because you want them indexed; rather, because you want them removed from the search results.

If you add the required noindex tag or HTTP header to those 404 pages, you will be informing the search engine (Google in this case) that those links should lose their place in the search results.
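For example (a minimal sketch; how you emit the header depends on your server), the tag would go in the page's <head>:

<meta name="robots" content="noindex">

or, equivalently, the response could carry the HTTP header:

X-Robots-Tag: noindex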

In Google's words:


When Googlebot next crawls that page and sees the tag or header,
Googlebot will drop that page entirely from Google Search results,
regardless of whether other sites link to it.


Source: support.google.com/webmasters/answer/93710?hl=en


 query : How to create a Segment of just pages that are in a particular Content Grouping? I'm trying to view Landing Pages for visits that included viewing pages in a group of pages (a Content Grouping).

@Cody1181609

Posted in: #GoogleAnalytics

I'm trying to view Landing Pages for visits that included viewing pages in a group of pages (a Content Grouping).

So I want to see a list of Landing Pages and the number of sessions that started from each, where the Session included a visit to a page in the Content Grouping.

That way I can see how many visits to a Content Grouping page were generated by each landing page.

What I tried

I think the way to do that is to view Landing Page but use the Content Drilldown's Filter feature (see below). But I can't see how to do that. The only option I see is Filter : Content Group contains (or is or matches, etc.) some value.



But I don't know what you'd use for that value to check against. (see screenshot)


 query : Will link juice still be passed if you have the same links in multiple, outreach articles? We are developing high quality, unique content and bloggers are hosting these articles. In these articles

@Cody1181609

Posted in: #Backlinks #GoogleSearch #Seo

We are developing high quality, unique content and bloggers are hosting these articles. In these articles we have several links to 2 to 3 sites. While the links are completely relevant, each article points to the same 2 to 3 sites. The link text varies slightly from article to article, but the linked-to site/URLs remain the same.

We have read that it is best to have 2 to 3 external links, not all pointing to the same site. We have followed this rule, but the 2 to 3 external sites are the same sites on the other articles. I'm having a hard time explaining this, so I hope this makes sense.

According to a comment by Simon Haytor on this question, this link setup describes a Link Wheel, and the site(s) will be penalized by Google.

My concern is: will Google see this as a pattern, with the result that link juice won't be passed to the linked-to URLs?

Also, can someone confirm that Simon's comment is accurate? That is, is what I described above considered a "link wheel" and the site(s) doing the linking and/or being linked to will be penalized by Google?


 query : How to tell Google I'm doing a search from another city than mine Based on the assumption that the search results of a query on Google vary from a location to another, I need to test the

@Cody1181609

Posted in: #GoogleSearch #Seo

Based on the assumption that the search results of a Google query vary from one location to another, I need to test the search results for the website of a shop in a city far from mine. I can find the shop I'm looking for in the first page of results if I add the name of the city as a keyword, but I am pretty sure it appears without this if a user from around that city searches with only the keywords.

How can I test that? I would like to pretend, using Google, that my location is a city other than the one it has identified for me.


 query : How to make new domain redirect to specific page of main domain without losing SEO? So I have a exchange office (lets say www.example.com) Wordpress website and I bought a new domain just to

@Cody1181609

Posted in: #Nginx #Redirects #Seo

So I have an exchange office WordPress website (let's say example.com) and I bought a new domain just to sell euros (www.euroexample.com). I want this new domain to show the specific euro exchange page on the main domain. What is the best way to do this without affecting SEO? It is currently working with a proxy_pass on Nginx: when I access euroexample.com, it shows the content of example.com/euro. Is this a good idea? It doesn't sound like one.

server {
    listen 80;
    server_name euroexample.com www.euroexample.com;

    location / {
        proxy_pass http://example.com/euro;
        proxy_set_header Host euroexample.com;
    }
}


 query : Re: Even nofollow links can hurt my website? I have a forum with around 20,000 organic traffic with 100+ forum threads and 1000+ answers. Many users already have requested that allow them to use

@Cody1181609

Marketing people and general advice say not to worry about nofollow outbound links, but Matt Cutts said nofollow links do sculpt your PageRank (check the 4th and 5th questions). I don't know whether that is still true or not, but Wikipedia works fine with tons of nofollow outbound links, so it is not as if they will hurt you badly.

If your forum has good-quality content contributed mostly by your users, then allow them to have signatures. But I would probably add some limits, such as only allowing signatures once a user has more than 50 posts; and if users simply post junk just to reach that limit, then I would go for manual approval.


query : Re: Are there only 12 directives for the robots meta tag? As far as I can tell, in 2017, the <meta name="robots" content="index, follow"> accepts the following directives in the content attribute: index follow

@Cody1181609

There would seem to be only 8 mainstream directives for the robots meta tag (or X-Robots-Tag HTTP response header) that really matter. Google includes 9 directives in its list of valid directives, including all, which is arguably just ignored.

Note that Google does not include index and follow in its list of supported directives. These are most probably just ignored, as I would expect any "unknown" directive to be.

Google lists the following supported directives (which are already included on your list):



all - There are no restrictions for indexing or serving. Note: this directive is the default value and has no effect if explicitly listed.
noindex - Do not show this page in search results and do not show a "Cached" link in search results.
nofollow - Do not follow the links on this page
none - Equivalent to noindex, nofollow
noarchive - Do not show a "Cached" link in search results.
nosnippet - Do not show a text snippet or video preview in the search results for this page. A static thumbnail (if available) will still be visible.
notranslate - Do not offer translation of this page in search results.
noimageindex - Do not index images on this page.
unavailable_after: [RFC-850 date/time] - Do not show this page in search results after the specified date/time. The date/time must be specified in the RFC 850 format.






Are there any other directives which might be included in a fully comprehensive list?


There could be any number of "other" directives that are (or were) supported by "other" bots. The syntax is designed to be extended and used as required. However, unless you need to support some "other" bot (in which case you would probably already have been made aware of the specific directive to use), the de facto standard is probably the list stated by Google above.
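For reference, any of the directives above can be combined and delivered either in the page markup or as an HTTP response header; a minimal illustration:

<meta name="robots" content="noindex, noimageindex">

or, equivalently:

X-Robots-Tag: noindex, noimageindex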


 query : Google Analytics ranking-(loss) because of domain suffix? I have a general question about Google Analytics. Unfortunately I have not found a satisfactory answer to this on the internet. My main

@Cody1181609

Posted in: #Analytics #GoogleAnalytics #GoogleRanking #Ranking #Seo

I have a general question about Google Analytics. Unfortunately I have not found a satisfactory answer to this on the internet.

My main domain looks like this: maindomain.de/de
Can this basically lead to a ranking loss because the main URL ends with "/de"? How does Google evaluate such URLs?

I came to this question because I ran a backlink check and saw that none of the backlinks lead to the URL ending with "/de". Instead, they point to the domain without "/de".

I hope you understand my question and can help me :)


 query : When to split content pages for SEO keyword relevancy and ranking There are times when we want to rank a single page for multiple similar keywords. For example, take the following page: https://example.com/organic-apples-ben

@Cody1181609

Posted in: #DuplicateContent #GoogleSearch #Seo

There are times when we want to rank a single page for multiple similar keywords. For example, take the following page:
example.com/organic-apples-benefits
This page would discuss the benefits of organic apples over conventional apples.

Some of the keywords we might want to rank for include, in priority order:


Organic Apple Benefits
Are There Any Benefits To Organic Apples
Cost Benefit Organic Apples


All of these terms would be relevant to this page but we can only have one title tag and one h1 tag and we can only optimize those tags for just one of the terms.

In this case, should we create 3 different pages, each focusing on a specific term? We are hesitant to do this, as it will make our content thin or, worse, a duplicate of the other pages.

Or should we optimize the title and h1 tags for the first term since it's the most important, then use the other terms in h2 tags and body copy? I realize that there are many factors to consider, especially related to competition, but assuming all these keywords have low competition, is this a sound strategy vs splitting the terms into multiple landing pages?


 query : Re: How Late is Too Late to Redirect Pages of Domain for SEO Our company recently had a new website designed for them to replace the old one that was literally made sometime in the mid-90's.

@Cody1181609

Firstly, it’s definitely not too late to sort out the redirects. As others have mentioned, search engines will periodically crawl old URLs, pick up any changes and update where appropriate. I’m actually working on fixing some redirects at the moment that are over 3 years old.

From the situation you describe, it is quite likely that most of the problem is that you are redirecting the old content pages to the homepage of the new site, rather than doing a one-to-one mapping, where possible, to the same or similar content at the new URLs.
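As an illustration only (the old paths are hypothetical and Nginx is assumed, since the question doesn't say which server is in use), one-to-one mappings inside the relevant server block would look something like this, rather than a single catch-all redirect to the homepage:

location = /old-products.html { return 301 https://www.example.com/products/; }
location = /about-us.htm { return 301 https://www.example.com/about/; }

# avoid a blanket rule such as: location / { return 301 https://www.example.com/; }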

Google provides a good guide on the steps for moving to a new domain, which you would normally go through during the process, but it can certainly be used in retrospect as well:
support.google.com/webmasters/answer/6033085?hl=en&ref_topic=6033084
Although it sounds like there have been issues with this migration, even with all content mapping and redirects correctly set up it is quite normal to see some fluctuations in the first month or so after a move.

Also bear in mind that there are a lot of other factors that can have an impact on rankings, such as page layout, quality of code, site performance, whether content is being dynamically served, internal linking, etc.

Google has some fairly clear guidelines on what it is looking for in a good site in terms of general performance, quality and content:
support.google.com/webmasters/answer/35769?hl=en
If you are able to post the old and new URLs, then we might be able to see if there are any other issues with the new site.


 query : Adding another domain SAN to a shared hosting server's multi-domain SSL certificate? I am familiar with simple DV certificates, but not any of the other fancier stuff - like EV, wildcard and

@Cody1181609

Posted in: #MultipleDomains #Security #SecurityCertificate #SharedHosting

I am familiar with simple DV certificates, but not any of the other fancier stuff - like EV, wildcard and Multi-domain SSL certs.

As the title suggests, I'd like to know what the process is (and how much extra it costs) to add an additional domain name to an existing shared hosting server's multi-domain SSL certificate.

Long story short: an association I belong to has a website (CMS) through a vendor, and it is not secured by HTTPS. All members' logins and data are sent in the clear. I raised the concern about this, and a few days later the manager of the club informed me that the company/webmaster of the club's site wants to charge us an additional monthly or yearly fee, just for the privilege of having secure communications with their app/website (it's their SaaS).
I did some digging and realized the company providing the club's website has our site on a shared hosting server, along with other domains that use the same CMS/SaaS, and that some but not all of the domains/sites on that server are in the multi-domain SSL certificate.

Personally, in the age of free/cheap SSL certs (Certbot/Let's Encrypt), I think this is a scandal, especially considering that I'm sure it doesn't cost much to add our domain to the multi-domain cert they are already using. If it helps any, the SSL certificate they have is from Comodo.


 query : SEO impact of repeating introduction text on parent page I've been asked to add introduction text from a service page to a section on a parent page that list overview for that service and

@Cody1181609

Posted in: #DuplicateContent #Seo

I've been asked to add introduction text from a service page to a section on a parent page that lists an overview of that service and any sibling pages of that service.

I've been instructed to use exactly the same text as the service page's first paragraph.

I'm thinking that this will more than likely impact SEO, the cause being duplicate content.

Is there a way I can get around potential duplicate content issues whilst using the exact same text?


 query : Should my canonical URLs have only the ID with no slug so that they don't change? Our URLs will be created by the user's content. Eg: www.example.com/bar-1234/i-went-to-baz-4321 The "i went

@Cody1181609

Posted in: #RelCanonical #Seo

Our URLs will be created by the user's content. Eg:
example.com/bar-1234/i-went-to-baz-4321
The "i went to baz" portion is user generated, included with the aim of being helpful to the person viewing the URL and also to possibly contain useful keywords for search engines.

However, the string that forms that portion is user editable. So it could become:
example.com/bar-1234/we-drove-to-baz-4321
AFAICT the canonical URL should be:
example.com/bar-1234/4321
The string portion is discarded by the controller, which means that there's potential for duplicate content from limitless URLs!

So, my question is, does it matter if my canonical URL is 'non SEO friendly'? It seems logical to me to provide search engines with the "id only" version of the URL which is always consistent, but the canonical URL is the one that appears in search results, so is there any benefit from trying to use more 'SEO friendly' URLs?
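For illustration, using the hypothetical URLs from the question, every slug variant of the page would declare the ID-only URL as canonical in its <head>:

<link rel="canonical" href="https://www.example.com/bar-1234/4321">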


 query : Re: How do you add a rule just for a specific bot to robots.txt? I have a small website, for which the current robots.txt looks like so: User-agent: * Disallow: Sitemap: https://www.myawesomesite.com/sitemap.xml

@Cody1181609

By default, all bots are allowed to crawl all parts of your site, with or without the first snippet you provided.
So to disallow one bot, just add the second snippet:

User-agent: SomeStupidBot
Disallow: /


and remove:

User-agent: *
Disallow:


I am not an expert in robots.txt, but from my understanding this is how it's done.
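Following that advice, the resulting robots.txt (keeping the Sitemap line from the original question) would look like this; all other well-behaved bots remain allowed by default:

User-agent: SomeStupidBot
Disallow: /

Sitemap: https://www.myawesomesite.com/sitemap.xml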


 query : Re: When and how does Google Webmasters let you know a "Change of Address" has been successful? I migrated my blog from one domain to the other and initiated a change of address using Google Search

@Cody1181609

I'm not entirely sure what you mean by "change of address." Do I understand correctly that you've moved your blog from Domain A to Domain B? What do you mean by Google being done with the change of address? Usually Google isn't involved in that process.

Now, if you mean to say that Google's Webmaster Tools have been updated to analyze your new domain, and you're seeing stats coming in, then it's likely working and done, since you 301'd your content. Making updates shouldn't be an issue, since they're going onto the new domain, anyway.

Otherwise, the best way to be sure is to either:

1) Shut down the old domain or...
2) Remove the old blog or access to it from the host

...and then see if you have stats trickling in.


 query : Re: Versioned pages: making sure Google points to the current The open-source Sinon project has some docs that are versioned per-release. That means that when searching for sinon documentation, you

@Cody1181609

Good Way


Link to your newer version documents from as many locations as you can. More links means a higher position. So you can point to your newer version docs from the old docs pages, for example with a note like "this is an outdated page, please see the new version". You can also point to them from the sidebar, homepage and other important pages which themselves have a good amount of incoming links.


Bad Way


If those old pages are not important for search users, meaning they do not answer a user's query, then you can use a canonical link tag. If your old docs contain a canonical link tag pointing to the newer version of the docs, then those old docs will disappear from the search results. That means Google will de-index your old docs, but users can still access them. So this is a better option than redirection.
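A minimal sketch of that tag, using a hypothetical docs URL rather than anything from the actual Sinon site, placed in the <head> of each old-version page:

<link rel="canonical" href="https://docs.example.com/latest/getting-started/">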


Ugly Way


You can use rel="prev"/rel="next" markup as well, which mostly gives higher priority to the first page (that is purely based on search results I have personally seen on Google). It is mostly used for pagination, but you can use it for your docs versions as well. The plus point of this approach is that Google can still index and show your old docs in the search results.


query : Re: Is it possible to host a single website on multiple servers? I have a file sharing website. It's hosted on a dedicated server which has 16 GB RAM and 2 TB storage. Now I need more storage

@Cody1181609

It's better to use the AWS EFS service or S3 to store your content. If you still want to stay with the same setup and need a solution, then you can take another small server with a 2 TB HDD and use NFS to mount that HDD on your existing server. That way you will be able to use 4 TB of storage from the existing machine.
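A rough sketch of that NFS setup, with hypothetical IPs, paths and mount points (adjust to your environment):

# On the new storage server (10.0.0.2): /etc/exports
# allow the existing web server (10.0.0.1) to mount the shared directory
/srv/storage 10.0.0.1(rw,sync,no_subtree_check)

# On the existing web server: /etc/fstab
10.0.0.2:/srv/storage /mnt/storage nfs defaults,_netdev 0 0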
Let me know if you need further help on this.


 query : Structured data for social media profile doesn't show up under "Rich Cards" I'm trying to use rich cards to get Google to show my social media profiles in its "knowledge panel". I've followed

@Cody1181609

Posted in: #GoogleSearchConsole #KnowledgeGraph #RichCards

I'm trying to use rich cards to get Google to show my social media profiles in its "knowledge panel".

I've followed Google's instructions for doing this but I can't get it to work. In Search Console the data I've used shows up under Structured Data and shows no errors, but doesn't show up under Rich Cards.

The code I've used:
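For reference, Google's documented social-profile markup follows this general pattern (the name and profile URLs below are placeholders, not the asker's actual markup):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Person",
  "name": "Your Name",
  "url": "https://www.example.com",
  "sameAs": [
    "https://www.facebook.com/your-profile",
    "https://twitter.com/yourhandle"
  ]
}
</script>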


query : GoDaddy domain delegated to Wix: add subdomain to No-IP I have purchased a domain in GoDaddy and delegated to my client's website created in Wix and is working OK. Now my client wants to add

@Cody1181609

Posted in: #Dns #DynamicDns #Godaddy #Wix

I have purchased a domain at GoDaddy and delegated it to my client's website created in Wix, and it is working OK.

Now my client wants to add some subdomains that point to a local machine on his network using No-IP.

I need to delegate that subdomain to No-IP and keep the main site in Wix.
Is that possible?


 query : I've just purchased a domain and getting cold called I've just bought a domain and I've just been cold called 7 times today. I would like to know what companies are selling new registrations

@Cody1181609

Posted in: #Domains

I've just bought a domain and I've already been cold called 7 times today. I would like to know which companies are selling new registrations as leads. I am aware I can privatise my information, but I'm not paying £4 per annum for the right to be anonymous.


 query : Environment specific client-side config file I'm developing a WebRTC app where the endpoints along with some other options are different for production and development. I was handed this project

@Cody1181609

Posted in: #Clients #Configuration #SiteDeployment

I'm developing a WebRTC app where the endpoints, along with some other options, are different for production and development. I was handed this project, so one of the codebases uses Node.js and the other uses plain HTML, CSS and JS. I know that Node.js has some modules that provide environment-specific configs (e.g. config), but they are all server-side.

My current solution is to have a config.js file in both branches which is statically served to the client. This, however, doesn't feel like best practice, and managing them with Git is pretty sketchy, since I want the config.js file to be included in the repository but I don't want it changed when merging branches.

What are the most common ways to pass environment specific configuration options to the client in a web app?


 query : Will adding 100 meta keywords to the same page hurt or help SEO? Is it possible to add several keywords to the same page? Will this penalize my page by Google? Example: What is the implication

@Cody1181609

Posted in: #Keywords #MetaKeywords #MetaTags #Seo

Is it possible to add several keywords to the same page? Will Google penalize my page for this?

Example:
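A generic illustration of such a tag (the keyword values here are placeholders):

<meta name="keywords" content="first keyword, second keyword, third keyword">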



What are the implications of adding many keywords? I am expecting to get traffic from all the keywords, whether there are 100 or 1,000,000 of them.

Also, I would like to know: does the meta name have to be "keywords", or can I put anything there, like <meta name="anything">?


 query : Re: Does Hashtag in the URL affect SEO rankings for as Angular JS web application I am using Angular JS to develop my web application. My current URL will be like this: localhost:8080/#!/home. I

@Cody1181609

Google just doesn't understand the meaning of /#!/, or of anything else like the /10787/ in this question's URL, but it can crawl hashbang /#!/ URLs just like other URLs.

I suggest going for clean URLs; if it takes only a few hours to set up, it will impress your visitors as well. A keyword in the URL has a very small effect on ranking, and the position of the keyword, whether after /#!/ or /10787/ or three subdirectories deep, does not make much difference.


 query : Cannot "fetch as Google" in Search Console after implementing HTTPS because of redirect We have recently switched from HTTP to HTTPS. When I tried to "fetch as Google", it says the URL has

@Cody1181609

Posted in: #301Redirect #FetchAsGoogle #GoogleSearchConsole #Https

We have recently switched from HTTP to HTTPS. When I tried to "fetch as Google", it says the URL has been redirected and I am not able to see how the website is going to look.

Is it because Google Search Console was set up for http://example.com rather than https://example.com?

If this is the issue, do I need to add a second property to Google Search Console for the HTTPS version of the domain? What do I do with the old authorization meta tag?


 query : Re: Main Domain & Subfolders to Main Domain and Subdomains (Search Console) Hi want to know we launch a new redesign of site (multi regional) we went from one main domain and subfolders to main

@Cody1181609

If the sitemap contains a URL that redirects somewhere else, that will not hurt at all, so keep your main sitemap as it is; it will not be harmful, and it may even help your new subdomain content get indexed faster.
Just implement proper 301 redirection. No matter which URLs your sitemap contains, with proper 301 redirection in place Google will overwrite them, so make sure example.com/some/page/ redirects to some.example.com/page/. If you have implemented proper 301 redirection on every URL, it will never return a 404 error page. So please check your 301 redirection again.
As I said, the sitemap neither does harm nor is it hugely useful; it just helps Google discover new content. When Google sees a new URL in a sitemap, it will start crawling it on its own schedule. So create one Search Console property per subdomain, such as sub.domain.com and subn.domain.com, then create an individual sitemap for each subdomain and submit it.
Make sure your subdomain sitemaps contain only URLs which really exist. I think this is where you're making a mistake. If Google sees a URL in your sitemap that does not point to a real webpage, then Google will notify you about that 404 URL. So make sure your subdomain sitemaps are correct and point to real URLs.
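A minimal sketch of such a redirect, assuming Nginx and the example hostnames above (the question does not say which server is in use):

server {
    server_name example.com;

    # map old subfolder URLs one-to-one onto the new subdomain
    location /some/ {
        rewrite ^/some/(.*)$ https://some.example.com/$1 permanent;
    }
}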


The main thing you should do is set up proper 301 redirection; if that is good, then you're fine. Also check your 404 errors; the report will tell you the source each broken URL is linked from.


 query : What are the risks of signing up with a web host using a domain attached to a live site? I have a live site with an existing web host; the domain is reg'd with an independent third party.

@Cody1181609

Posted in: #Dns #Live

I have a live site with an existing web host; the domain is reg'd with an independent third party. I would like to sign up with a different host, but this host requires me to provide a domain for the new site. I'd like to use the existing domain, but I don't want any changes to DNS or downtime on the existing site, while I build the replacement at the new host. Why is the new host requiring a domain? They offer a temporary URL / IP-based access for new sign-ups, so I don't understand why, strictly speaking, a domain name is needed for sign-up.

( This -- If I use a web hosting service with a custom domain name then later want to move to a different host, will I be able to use that same domain name? -- was the closest thing I could find, but doesn't really answer my question ).


 query : Removing content in specific directories from Google search results We are using a plugin on our website to handle job application submissions. I found out recently that the applications were

@Cody1181609

Posted in: #Google #GoogleSearchConsole #RobotsTxt #SearchResults

We are using a plugin on our website to handle job application submissions. I found out recently that the applications were being uploaded into a public directory. We've had some individuals approach us asking us to remove their resumes from our system, since they can be found with a simple search on Google.

I've taken some steps to protect the information in these directories:


Used robots.txt to Disallow Google from accessing the directories.
Password protected the directories.


It's been two weeks since I've put these measures into place yet I'm still seeing files from these directories in Google search results.

What can I do to ensure that the files that are in these directories no longer appear in Google's index?

Thanks
-Brandon


 query : Re: If .htaccess is used to block my bot from accessing a particular directory, will I know this? I'm working on a research project and I have a question. Say I would like to crawl all pages

@Cody1181609

I would need to know for sure that it has been blocked.... I don't want my bot to be blocked in a deceptive manner....


It's not really possible to know "for sure" (ie. 100%) whether your bot has been blocked, if it has been blocked in a "deceptive manner".

The site could theoretically return a 200 OK status and what looks like a valid response body yet you have still been "blocked" from seeing the intended content. In order to detect this kind of "block" you could perhaps compare the response you get with a "known valid response" for a "non-blocked" request. But how do you get that "known valid response" and what if the expected response is dynamic in nature?

Google must do something of this nature in determining "cloaked" responses (when Googlebot is served something different to what an ordinary user sees) - but I very much doubt this is 100%.


If .htaccess is used ...


Why the mention of .htaccess? I would have thought that the exact method used to block the bot is irrelevant? But anyway, you could still block a bot "deceptively" with .htaccess alone.


 query : Re: Do I have to provide email and an agent for Copyright Complaints? I have a website and it works better for me to just have an online Copyright Complaint form, under DMCA Notice of Alleged

@Cody1181609

There is no specific requirement on how to receive takedown notices, as long as third-party authors can send you notices easily. Dropbox, for example, has an online form:



While 500px uses an email address only:



It's worth reading the current best practices on DMCA takedown notices (we researched DMCA here):


Make your process clear.
Provide multiple links to your DMCA section, e.g. in your Terms agreement, in the footer of your website, and in a FAQ section of your website.
Provide your contact information or a contact form.
Help third-party authors send you a complete report, e.g. users should provide you with the URL of the infringing content. Consider Dropbox's example from above.

