
Nimeshi995

Last seen: Tue 11 May, 2021


Recent posts

 query : "Satellite" articles = duplicate content = Bad for SEO? I have a website with a collection of programming articles. Some topics are broad, some topics are narrow. A broad topic generally calls

@Nimeshi995

Posted in: #DuplicateContent #Seo

I have a website with a collection of programming articles.

Some topics are broad, some topics are narrow. A broad topic generally calls for a longer article, while a narrow topic can be just a paragraph.

If a broader topic covers subtopics A, B and C, I'd still like to create three separate "satellite articles", one for each subtopic, because


I'd like to have coverage for search terms specific for A, B and C, and
I don't want my visitors to have to scroll around to find their answer. If they have a question about topic B, it's best answered in an article that precisely targets B.


Think of it like the "Main article: ..." references under certain sections on Wikipedia:



In numerous places I've read that content duplication is terrible for SEO. So, my question is:

Does this approach of covering the same topics (sometimes verbatim) in multiple articles hurt my SEO?


 query : GSC 403 Error - Why have some web pages dropped out of Google SERPs whilst others haven't? I have a site that has been on 1st page of Google for years but suddenly it has dropped right

@Nimeshi995

Posted in: #403Forbidden #Error #Googlebot #GoogleIndex #GoogleSearchConsole

I have a site that has been on 1st page of Google for years but suddenly it has dropped right out of search results.

I checked Google Search Console and it shows 39 "Access Denied" errors (out of 50 total pages/posts). The crawl errors show 403s for the home page and most pages/posts, but not all.

Doing a "Fetch" (for home page) just throws up "! Error" - which I presume means Googlebot can't crawl it.

Doing a Server Header check at tools.seobook.com/server-header-checker/ shows:

Requesting www.cheapremovalsburnley.co.uk SERVER RESPONSE:

HTTP/1.1 200 OK
Date: Sat, 10 Mar 2018 09:45:35 GMT
Server: Apache
Expires: Wed, 11 Jan 1984 05:00:00 GMT
Cache-Control: no-cache, must-revalidate, max-age=0
Pragma: no-cache
Link: ; rel="https://api.w.org/", ; rel=shortlink
Set-Cookie: wfvt_2095637327=5aa3a9408dbd8; expires=Sat, 10-Mar-2018 10:15:36 GMT; Max-Age=1800; path=/; httponly
Connection: close
Content-Type: text/html; charset=UTF-8


Which I think looks OK?

I have tried deactivating all plugins and then doing another "Fetch" in GSC, but I still get the same "Error".

I cannot see any difference between the pages that are being crawled and those that are denied.


 query : HTTPS request connection timeout (NGINX) I tried a lot by reading different answers on Stackoverflow and other forums but unable to identify the error, can someone please check the below configuration

@Nimeshi995

Posted in: #Centos #Https #Nginx #SecurityCertificate

I have tried a lot, reading different answers on Stack Overflow and other forums, but am unable to identify the error. Can someone please check the configuration below and tell me what could be wrong?

server {
    server_name test.domain.com;
    listen 80;

    location / {
        proxy_pass http://176.X.XX.XXX:30805;   # proxy_pass needs a scheme (http://)
    }
}

server {
    server_name test.domain.com;
    listen 443 ssl;

    access_log /var/log/nginx/ssl_access_test main;
    error_log /var/log/nginx/ssl_error_test info;

    ssl_certificate /root/ssl/wildcard.domain.com.pem;
    ssl_certificate_key /root/ssl/wildcard.domain.com.pem;

    ssl_prefer_server_ciphers on;
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    ssl_ciphers ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+3DES:DH+3DES:RSA+AES;

    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_redirect off;                     # was "proxy_redirect ;", which is invalid syntax
        proxy_pass http://176.X.XX.XXX:30805;   # proxy_pass needs a scheme (http://)
    }
}


I am able to access the page over plain HTTP, but the HTTPS version gives a connection timeout instead. I also tried to redirect from port 80 to 443, but it didn't help, despite trying a lot of different configuration combinations.
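For reference, the port-80 redirect attempt mentioned above is usually a single return directive in the plain-HTTP server block. A sketch only; note that a redirect cannot fix a timeout on port 443 itself, which more often points to a firewall blocking 443 or the config not being reloaded:

```nginx
server {
    listen 80;
    server_name test.domain.com;

    # Send all plain-HTTP traffic to the HTTPS server block
    # instead of proxying it directly.
    return 301 https://$host$request_uri;
}
```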


 query : No CAPTCHAs correspond to the "I'm not a robot" checkbox captcha alternative: This is only available if you are using Recaptcha v2. (If your Captcha was setup before October 2014, chances

@Nimeshi995

No CAPTCHAs correspond to the "I'm not a robot" checkbox captcha alternative:

This is only available if you are using reCAPTCHA v2. (If your CAPTCHA was set up before October 2014, chances are it has been deprecated and doesn't include the No CAPTCHA.)
Passed CAPTCHAs, including No CAPTCHAs, reflects how many CAPTCHA tests succeeded.
Failed CAPTCHAs reflects the total count of CAPTCHA tests that failed as confirmed/submitted to the server. (Reasons for failure include actually clicking the wrong pictures or typing the wrong code, but could also include failures caused by corrupted data being passed or an improper API key, per the list of error codes at the bottom of the Verifying the User's Response page.)
Total Sessions reflects all your requests/connections to the reCAPTCHA service. If a user reloads the entire page or resets an individual CAPTCHA on the page, those count as new CAPTCHA sessions. It also includes any invisible reCAPTCHAs where regular visitors bypass filling in a CAPTCHA because of cookies stored in their browsers from a previous visit. A session is still requested, but a CAPTCHA may not be presented.
Passed Sessions reflects all successful CAPTCHAs plus those returning visitors who bypassed the need to reverify.
Average Response Time (seconds) reflects the round-trip time between Google serving the reCAPTCHA to your page and receiving a submission for verification. It is affected by network latency as well as by users taking their time to respond.


 query : Re: Voice to text duplicate content? Will a direct transcript of a voice to text blog post be seen as duplicate content Do Google web crawlers see this as duplicate?Do they have the tech and can

@Nimeshi995

If you are concerned that an audio file and a transcript of the file would be duplicates, I assure you this is not a consideration. While it is technically possible to transcribe an audio file, it is not a pragmatic exercise. Fears of Google trying to understand everything ignore whether doing so serves any purpose. It does not. Not in search. Not when the web is viewed as an ontology to be understood. Audio and video do not fit this model at all, and may never.

Let me explain.

Prior to Google, search engines were fairly simple text-based search applications. This all changed when Google appeared on the scene, fully intending to be a semantics-based search engine. Semantic technology has existed for decades, since at least the early 1970s. While the technology was mature, the primary problem with applying semantics was the lack of opportunity to apply it. Simply put, the ontologies, in this case collections of documents, were not significant enough in size and scope to apply the technology except on rare occasions.

Enter the web.

Holy snarky SE posters, Batman! We may actually have an occasion to use semantics. Admittedly, Google applied semantics in a fairly rudimentary way when it started; however, the application of the technology exploded in ways no one could have envisioned.

That said, it is technically possible to extract text from images and videos, and vocal content from audio. However, since semantics is designed to understand the written word and is applied so thoroughly to text as it exists on the web, much would have to change.

Also consider that audio, for example, could contain other sounds that make the speech difficult or impossible to extract. And with so many recordings, the question becomes: what could be extracted, and would it be of value?

Let us simplify matters for the moment and take extracting text from images as an example. Not many images contain text, and of the ones that do, not much text would be extracted. The next question is: would the text be of value? For image search? Yes. For document search? No. Why? Because of the high level of noise in the data. Even when the text is clear, there would be little to no value in it. As a signal, keeping in mind that models do not fare well with missing or nonsensical data, its value would be nil. Applying this further, audio and video have the same problem. They do not fit the model of applying semantic analysis to the text of a large-scale ontology decades in the making.

Today, this is not a consideration, and therefore audio files are not indexed and cannot create a duplicate content scenario. This may change in the future, of course, but not without significant work. Just because something is technically possible does not mean it is practical or makes sense. Think smellivision for your TV. The technology has existed for decades but makes no sense at this point. (Humor)


 query : How to force google to reset sitelinks? Google keeps displaying sitelinks from an old version of my website. Most of them lead to 404s, since the old website is offline. I tried to demote

@Nimeshi995

Posted in: #Google #Sitelinks

Google keeps displaying sitelinks from an old version of my website. Most of them lead to 404s, since the old website is offline. I tried to demote sitelinks individually, but other sitelinks from the old website keep popping back. Weeks go by but nothing changes. Is there a way to tell Google to speed up updating its data?

Also I already created an xml sitemap for the new website.


 query : Re: Should JSON-LD contact markup be placed on a homepage or Contact Us page? I am wondering if JSON-LD contact markup should be placed on a homepage or Contact Us information page, or if both

@Nimeshi995

Both are fine; either one alone is fine too. As long as search engines can crawl that specific page, they will most likely take note.

Google states in their structured data spec: "Include the contact markup on one page in your official site." (https://developers.google.com/search/docs/guides/enhance-site) Which is to say, one page is enough.

Google has stated before that JSON-LD structured data should appear wherever it mirrors the content. So if your Contact Us page has all that info, it makes the most sense to include the JSON-LD on that page. However, your home page may also have this info, or you may have it in your global footer, so the home page could work too. You won't get penalized for using both, but one is enough. (My personal preference is the Contact Us page, simply because it makes sense, particularly if you have your complete info there and not just a web form.)
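As a rough sketch (organization name, URL, and phone number are placeholders, not from the question), the contact markup under discussion generally looks like this on whichever page you pick:

```javascript
// Hypothetical Organization/ContactPoint JSON-LD payload; all values are placeholders.
const contactMarkup = {
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "https://www.example.com",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+1-555-555-1212",
    "contactType": "customer service"
  }]
};

// This JSON string is what would go inside a
// <script type="application/ld+json"> tag on the chosen page.
const jsonLd = JSON.stringify(contactMarkup, null, 2);
console.log(jsonLd);
```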


 query : Google analytics track product id I have a website where users can add products to favorites. I want to track how many user are using this feature, and later be able to access this information

@Nimeshi995

Posted in: #AnalyticsEvents #EventTracking #GoogleAnalytics

I have a website where users can add products to favorites. I want to track how many users are using this feature, and later be able to access this information to see how the feature is performing, using the Google Analytics API.

So I will create an event to track this when a user clicks on the button.

ga('send', {
  hitType: 'event',
  eventCategory: 'Products',
  eventAction: 'click',
  eventLabel: 'favorite'
});


Then, to see how many favorites a product has received, I will have to filter how many events have been fired on the product URL.

The problem, is:


If a user clicks the add-to-favorites button in a list of products, the event is not fired on the product URL, so it will not count toward that product's events.
If I add related products on the product page, each with its own button, the related products' favorites would count toward the main product, since they are fired on the product page URL.


How would you approach this? Maybe I could add the product id to the eventLabel and change the eventAction to "click-favorite", but I feel that this is not the correct solution.
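That idea can be sketched as follows; buildFavoriteEvent is an invented helper (not part of analytics.js) that centralizes the payload so the product id travels with the hit no matter which page fires it:

```javascript
// Illustrative helper (not a GA API): builds the event payload with the
// product id in the label, so list pages and related-product buttons
// all report the correct product.
function buildFavoriteEvent(productId) {
  return {
    hitType: 'event',
    eventCategory: 'Products',
    eventAction: 'click-favorite',
    eventLabel: String(productId) // product id rather than a static "favorite" label
  };
}

// With analytics.js this would be sent as: ga('send', buildFavoriteEvent(42));
const payload = buildFavoriteEvent(42);
console.log(payload.eventLabel); // prints: 42
```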

Thank you.


 query : How to get such result on google? Sometimes when I google something, I notice that some websites are shown with some several pages as results. Like in the picture here : Does anyone know

@Nimeshi995

Posted in: #Google #Seo

Sometimes when I google something, I notice that some websites are shown with several pages as results, like in the picture here:



Does anyone know how to get such results on Google search? What should I add to my website to make it appear like this?

I have been looking at rich snippets, but I'm not sure that's it.

Thanks


 query : Navigation bar: How to create a slide effect from the current navigation item to another hovered one - without JS? There's a visually nice slide effect on http://www.gaeste-etage.de/de/ in the

@Nimeshi995

Posted in: #Css3 #Navigation #ResponsiveWebdesign #Svg

There's a visually nice slide effect on www.gaeste-etage.de/de/ in the horizontal top navigation bar:

When you hover with the mouse over any navigation item other than the current one, the orange rectangle slides to the hovered item.
It is important for the effect that the current item does not stay orange.

The creator of the website used JavaScript for that, and he added extra markup in the form of a div.

My question:
Is it possible to create a comparable effect with just CSS/SVG, without JS and without additional markup?

I'm working at a website which should be responsive and easy to maintain.
So it is important, that it is not necessary to declare a specific width for each navigation item.

For example, I found www.css3create.com/Menu-en-full-CSS-style-Lavalamp but the author declares a specific width in em for each item.

Thanks for your advice.


 query : Impact of subdomain urls on SEO I have an application which is hosted on a subdomain URL of my client's primary domain. Below is the example of how my primary domain is and how the subdomain

@Nimeshi995

Posted in: #Keywords #Seo #Subdomain #Url

I have an application which is hosted on a subdomain URL of my client's primary domain. Below is the example of how my primary domain is and how the subdomain is:


Primary domain: example.com
Subdomain: replacephone-carriername.example.com


Now please help me understand the points below:


Does my primary domain become my subdomain's competitor?
The carrier for which my client created the application has all the details of the replacephone program, and this new application URL is linked from the carrier's main portal. Because of this, the carrier URL also becomes my subdomain URL's competitor, since we are both competing for keywords like "replacephone-carriername" with different URLs.


Please advise me on how to address the above two challenges.


 query : Re: Explain me some CSS Newbie to CSS. I don't get why :before and :after are used. Can I get some easy explanation,please?

@Nimeshi995

When we use ::before, the generated content is rendered (virtually) before the element's default content, and vice versa for ::after. The catch is that these are pseudo-elements, which do not change the real DOM, i.e. the change is visible but not part of the actual document.

Refer to this to learn more about it: css-tricks.com/almanac/selectors/a/after-and-before/
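A minimal illustration (the class name is made up): the generated content renders before and after the element's own text but never appears in the real DOM, so it is not selectable or copyable as document content:

```css
/* "Note: " is rendered before the element's text, " (end)" after it.
   Neither string exists in the DOM. */
.note::before { content: "Note: "; font-weight: bold; }
.note::after  { content: " (end)"; }
```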


 query : Re: There are lot of hAtom errors in my search console ?? What is this error, I didn't get it Whenever I'm publishing, a new article there is going to a new error by default on my search

@Nimeshi995

hAtom is a microformat for structured data whose entries are marked up with the hentry class, and it comes enabled by default in WordPress. It is a common occurrence that WP themes, especially 'free' ones, have one or more markup issues related to hentry; many developers do not test their themes with Google's rich snippet testing tool.

You have 3 choices to fix the issue:


Contact your template designer, or change themes.
Edit the template files and attempt to fix the issue yourself.
Disable hentry.


It's an issue experienced by thousands of webmasters all over the globe, so you should have no trouble finding further information and solutions to fix it.


 query : How do I get keywords indexed when text is so complexly styled that words are broken up? I was required to implement a web page that has sections where each has a designed title. Basic structure

@Nimeshi995

Posted in: #Html #Seo

I was required to implement a web page that has sections where each has a designed title. Basic structure like the following:



<div class="page">
  <div class="section">
    <h1>title section 1</h1>
    section content
  </div>
  <div class="section">
    <h1>title section 2</h1>
    section content
  </div>
</div>




My problem is that the web designer made titles visually complex like the following image:


Note that "SO" and "MOS" in the picture are not two words actually but just one word "SOMOS", but that's the design I was asked to achieve.

I replaced the h1 tags with the following HTML structure for each title

<div class="customH1">
<div class="quienes">¿QUIENES</div>
<div class="so">SO</div>
<div class="mos">MO<span class="underlined">S</span>?</div>
</div>


With this structure and CSS I achieved the design, so everything is OK so far. My doubt is about SEO. Remember the word "SOMOS"? I don't know whether search engines will understand the section title "QUIENES SOMOS" given that it's split across three different DIVs. They might see "QUIENES SO MOS", "QUIENES SO MO S", or something else than the real title.

I thought about giving a title attribute to the container DIV but I read it just helps for readability and not for SEO.

I also thought about using an image and giving it an ALT attribute, because I understand search engines consider the ALT attribute only on IMG tags, but I don't think using images is efficient.

I finally thought about using SVG and giving it a TITLE tag as I read somewhere this helps describe the SVG but I don't know if that helps with SEO.



<svg>
<title>I don't know if this will help with SEO while having a nice title designed in this SVG</title>
</svg>




Do you have any suggestions to make my designed titles SEO-friendly? Or maybe my current structure will somehow be understood by search engines already?

UPDATE:
I just used www.seo-browser.com to crawl my page and I got the following output: "¿QUIENES SO MO S ?"
I don't know if it is safe to assume Google will do the same. If it does, then I need to fix this.

UPDATE:
At least with Browseo.net there was a difference when deleting the spaces between the DIVs:

<div class="customH1">
<div class="quienes">¿QUIENES</div>
<div class="so">SO</div><div class="mos">MO<span class="underlined">S</span>?</div>
</div>


I hope Google sees it the same way.
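The difference the two updates describe can be approximated with a naive tag-stripping sketch (illustrative only; real crawlers are more sophisticated): treating divs as text-flow breaks splits the word, while inline spans keep it whole:

```javascript
// Rough approximation of naive text extraction: block elements (div)
// become whitespace boundaries, inline elements (span) do not.
function naiveExtract(html) {
  return html
    .replace(/<\/?div[^>]*>/g, ' ')   // divs break the text flow
    .replace(/<\/?span[^>]*>/g, '')   // spans do not
    .replace(/\s+/g, ' ')
    .trim();
}

const divSplit = '<div>¿QUIENES</div><div>SO</div><div>MOS?</div>';
const spanSplit = '<div>¿QUIENES</div><div><span>SO</span><span>MOS?</span></div>';

console.log(naiveExtract(divSplit));  // → ¿QUIENES SO MOS?
console.log(naiveExtract(spanSplit)); // → ¿QUIENES SOMOS?
```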


 query : Re: Use a single global domain with language subdirectories and language specific URL slugs for Wordpress internationalization We're looking to internationalize a WordPress website, and for multiple reasons

@Nimeshi995

No plugin required

This can easily be done without a plugin and by using WP page templates


Copy header.php to header-spanish.php
Copy footer.php to footer-spanish.php
Copy page.php to page-spanish.php


Edit page-spanish.php


Change get_header(); to get_header('spanish');
Change get_footer(); to get_footer('spanish');
Remove all information within /* */ and add /* Template Name: Spanish Pages */
Edit a page, then on the right side look for "Template" (under Attributes) and change the page template to Spanish Pages


Now you have a unique header and footer which will give you more control over your different languages.



What about Hreflang Plugins?

You could use an hreflang manager plugin; however, that only controls the <head>. If you want full control, template pages are the way to go, especially if you want different sidebars, footers and headers to match the language of the page.



3 methods to rewrite URLs with country-specific codes, e.g. /es/


Add a blank page called es with permalink /es/, make the page noindex, then on your Spanish pages simply set es as the parent, which will add the country code to the URL.
Use a rewrite plugin.
Use the WordPress Rewrite API


 query : How many reviews to include in Schema markup (JSON-LD) on product pages? We have product pages with multiple reviews. Some have 200. 300 reviews. Obviously we do not put them all in the page

@Nimeshi995

Posted in: #JsonLd #SchemaOrg #Seo #UserReviews

We have product pages with multiple reviews. Some have 200-300 reviews.
Obviously we do not put them all in the page HTML on page load; we show 5 by default. A person can hit "see more reviews" and 5 more will be AJAX-loaded, etc.

How many should we include in Schema markup (JSON-LD)?

I see some sites not including reviews at all, just the aggregate rating. I see some people including 5-10 reviews, and I saw one page including 50 reviews.
Theoretically we can include all of them (to avoid slowing down the page we can wait for DOM ready and then push the data). I just wonder if it makes sense.
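A hedged middle-ground sketch (product name, rating values, and counts are placeholders): emit the aggregate rating for the full review set, but embed only the handful of reviews that are visible in the initial HTML:

```javascript
// Placeholder Product JSON-LD: aggregateRating covers all reviews,
// but only the 5 reviews visible on page load are embedded.
const visibleReviews = [1, 2, 3, 4, 5].map(i => ({
  "@type": "Review",
  "author": { "@type": "Person", "name": `Reviewer ${i}` },
  "reviewRating": { "@type": "Rating", "ratingValue": 4 }
}));

const productMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": 4.4,
    "reviewCount": 300   // the full count, even though only 5 reviews are embedded
  },
  "review": visibleReviews
};

console.log(productMarkup.review.length); // → 5
```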


 query : OCSP Stapling still connecting to OCSP server Goal: Decrease the Time to First Byte on a server that uses Extended SSL validation. Possible Issue: I enabled OCSP stapling and ssllabs shows

@Nimeshi995

Posted in: #Openssl #Ttfb

Goal: Decrease the Time to First Byte on a server that uses Extended SSL validation.

Possible issue: I enabled OCSP stapling, and SSL Labs shows OCSP stapling is enabled, but tests at webpagetest.org still show the client making one call to the OCSP verification server. Intermittently, the communication with that OCSP server increases the SSL connection time to over 1.0 second.

Question: Before enabling OCSP stapling, two connections were being made to the OCSP server, but now there is only one. Shouldn't stapling eliminate all client connections to the verification server? If so, how can I troubleshoot the issue?

Edit: I just noticed that ssllabs has "OCSP Must Staple = No", but "OCSP Stapling = Yes" if that matters.


 query : Tracking where users originally arrived from, across two Google Analytics properties? I have a marketing site (site A), and an application site (site B). The sole purpose of site A is to encourage

@Nimeshi995

Posted in: #Analytics #Conversions #GoogleAnalytics

I have a marketing site (site A), and an application site (site B). The sole purpose of site A is to encourage users to register on site B, via various marketing content.

The sites run different Analytics properties, because I want to see stats about app users (number of sessions, browser type, etc) separately from stats about the marketing site. (NB: Maybe there's an easier way to handle this?)

I have goals set up for registrations and payments on site B. My task is this: I want to track where the users who complete the registration and payment goals on site B originally came from - i.e. before they arrived at either site A or site B. This is so that I can check which ads and other sources are most likely to generate paying customers.

The users' paths to the registration goal could be as follows:


Twitter ad -> Landing page on site B -> Register on site B
Third-party blog -> Site A -> Follow link to site B -> Register on site B
Google search -> Site A -> Follow link to site B -> Register on site B


I would like to record their origin points as follows:


"twitter ad X"
"blog post X"
"google search keyword X"


I'd like to record this in my database on registration and I'd also like to send the info to Analytics.

My Twitter ads are set up to use utm_source and utm_campaign on inbound links.

So what's the best way to record where users came from, in order to say "X% of paying customers came from Twitter ads, X% from blog post X, X% from blog post Y, X% from organic search"?

One idea: I could keep the referrer in a cookie, and pass it as a custom dimension to Analytics and also into the database when the user registers? This would also handle the problem of users navigating away from the site and then registering at a later date.

I know Analytics provides utm_source parameters, but how can I ensure that these are tracked between site A and site B? The user may click several pages on site A before going to site B.
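The cookie idea can be sketched like this (the helper name is invented for illustration): parse the utm_* parameters from the landing URL once, then persist them until registration:

```javascript
// Illustrative helper: pull utm_* parameters out of a landing URL so they
// can be stored (cookie/localStorage) and replayed at registration time.
function extractUtmParams(url) {
  const params = new URL(url).searchParams;
  const out = {};
  for (const key of ['utm_source', 'utm_medium', 'utm_campaign']) {
    if (params.has(key)) out[key] = params.get(key);
  }
  return out;
}

const landing = 'https://siteB.example.com/?utm_source=twitter&utm_campaign=adX';
console.log(extractUtmParams(landing));
// → { utm_source: 'twitter', utm_campaign: 'adX' }
```

The result could be written to a first-party cookie on arrival and read back at registration. One caveat: a cookie set on site A is not visible to site B unless both share a parent domain, which is one reason the values are often passed along in the link between the sites instead.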


 query : Visitors sending cookies that were never set by the host Occasional visitors to my sites send one or more cookies that were never set by my hosts. Sometimes it's an obvious bot. Sometimes it's

@Nimeshi995

Posted in: #Cookie

Occasional visitors to my sites send one or more cookies that were never set by my hosts. Sometimes it's an obvious bot. Sometimes it's even an attempt to exploit. But occasionally it looks like a reasonably legit visit, except for the cookies which are mundane like the date or an IP address.

Are there legitimate reasons why a user agent would deliver cookies to a host which never set them? Do any of the popular human-powered browsers ever do this?


 query : Re: Do you need to credit a thumbnail photo? Say, I use a thumbnail photo, as a smaller or cropped version of the original photo on a site/blog. The thumbnail is used as a preview to the post/article

@Nimeshi995

The above answer from @Stephen Ostermiller is incorrect. ANY use of an image that is unauthorized, regardless of size, is copyright infringement, and you would be liable for damages. If you don't believe me, call any copyright lawyer and tell them someone is using your images as thumbnails on their website and you want them taken down; the lawyer will offer to take your case and send DMCA takedown requests.

The only instance where you can get away with it is when users of your website upload the images and you have a DMCA section on your site that explains what a copyright owner must do to have an image removed. This is because your website is crowdsourced. If you, however, are the one posting the image, you are not protected by the DMCA. The only reason large websites like Google and Pinterest get away with it is that they have a lot of money to sink into legal fees and most copyright owners don't. Even if the images are crowdsourced, you can be held liable for damages if you are using the images commercially. The only way around this is to not monetize your site.

Source: www.copyright.gov/fair-use/more-info.html
Regarding using the item in a commercial sense, the weight given to commercial use, and of course determining damages (although, yes, damages are an afterthought), here is an excerpt from the above government URL (my emphasis in bold):


Fair use is a legal doctrine that promotes freedom of expression by
permitting the unlicensed use of copyright-protected works in certain
circumstances. Section 107 of the Copyright Act provides the statutory
framework for determining whether something is a fair use and
identifies certain types of uses—such as criticism, comment, news
reporting, teaching, scholarship, and research—as examples of
activities that may qualify as fair use. Section 107 calls for
consideration of the following four factors in evaluating a question
of fair use:

....

Effect of the use upon the potential market for or value of the copyrighted work: Here, courts review whether, and to what extent,
the unlicensed use harms the existing or future market for the
copyright owner’s original work. In assessing this factor, courts
consider whether the use is hurting the current market for the
original work (for example, by displacing sales of the original)
and/or whether the use could cause substantial harm if it were to
become widespread.


Clearly, the statute says that commercial use weighs on the decision when the impact on the original copyright owner's business is apparent. And what plaintiff wouldn't argue that?


 query : Re: Why does switching php versions from 56 to 71 in httpd.conf give me a 403 access denied error? (MacOS localhost) Recently installed php71 on my local test environment. I have an index.php

@Nimeshi995

PHP 7.1 requires the following addition to httpd.conf:

SetHandler application/x-httpd-php

When reverting to PHP 5.6, that line must be commented out, or again the PHP code will not be interpreted.


 query : Re: Does SEO value get transfered when a 301 redirect points to a page that has another canonical? I have this scenario: Page A is 301 redirected to another Page B, which has exactly the same

@Nimeshi995

A 301 redirect instructs search engines that the content at the requested page (Page A) has been moved to Page B on a permanent basis. When a browser receives this code, the visitor is automatically redirected to the new Page B. The 301 redirect is implemented when your domain is moved permanently, when web pages have been replaced or moved permanently, when your web page content has expired, or when a 404 error should point to another, relevant page. The SEO value isn't transferred like adding money to your bank account; the new page will grow its own value as it is found.

On the other hand, a rel="canonical" attribute doesn't redirect a visitor to the new URL. Instead, it signals to search engines which page to index in SERPs when duplicated content appears within a site. This attribute is used when the content is duplicated or similar on both affected pages, and both remain visible to visitors. It is also used for content syndication and when it's impossible to implement 301 redirects.
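For reference, the two mechanisms side by side (URLs and paths are placeholders). A 301 is issued by the server:

```apache
# Permanent redirect from the old URL to the new one (Apache example)
Redirect 301 /old-page https://www.example.com/new-page
```

while a canonical is an annotation inside the duplicate page itself:

```html
<!-- In the <head> of the duplicate/variant page -->
<link rel="canonical" href="https://www.example.com/preferred-page">
```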

Either of these practices may be good for SEO; which one you choose depends on your situation. Remember to consider the impact on your site overall as well as on the affected pages, and review your analytics results before making changes so you do not lose the momentum from popular pages.


 query : Re: Is Checking For mod_rewrite Really Necessary? Recently I noticed that many people post .htaccess files here with: <IfModule mod_rewrite.c> Sometimes this even appears several times in the file!

@Nimeshi995

It is mostly used by content management systems such as WordPress, since they can operate with either PHP-based rewriting or mod_rewrite. If mod_rewrite is not available and you do not use IfModule, some sites will fail with a 500 error.

Every install of WordPress will attempt to use .htaccess and during major upgrades it can trigger a check to see if this code is present:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress


On several occasions I have witnessed WordPress adding the code AGAIN if it has been modified. This is why WordPress users will generally leave that code unchanged or use chmod to prevent WordPress appending to it; it's actually a good idea from a security standpoint to set your .htaccess to 0444.

This is why I decided to leave it alone when answering one of my own questions regarding WordPress: HTTP to HTTPS without trailing slash results in double redirect.

Summary

Using <IfModule mod_rewrite.c> or not will not make any measurable performance difference for a normal website. If you can remove it, then by all means do; it won't make a difference either way, so just roll with what works best for your setup.

10% popularity Vote Up Vote Down


Report

 query : Re: How to get access to Google Tag Manager that was handled by a previous developer? My client site has GTM script included within the <head>...</head> but the client doesn't seem to

@Nimeshi995

Unlike Google My Business and several other Google services that allow customers to prove ownership and regain control using their domain email address, such as info@example.com, GTM does not.

Unless you have Google Tag Manager 360 then you would need to know:


The Google account used to create the GTM container (email address)
The password or the ability to reset the password.


FREE GTM accounts have no SLA or support, but if you have a 360 account then Customer Support will be willing to help you and manually delegate a new admin. Paid customers get support and more features.

The easiest method would be to start a new container, but this time using a Google account on the actual domain, e.g. info@example.com; that way you always have a method of resetting it.

10% popularity Vote Up Vote Down


Report

 query : Is Google analytics tracking all of my traffic? on all accessible variations/properties I am wondering is Google analytics track all of my properties? https://www.example.com (SSL + www) https://example.com

@Nimeshi995

Posted in: #GoogleAnalytics #Http #Https #NoWww #Subdomain

I am wondering: does Google Analytics track all of my properties?

https://www.example.com (SSL + www)
https://example.com (SSL, no www)
http://www.example.com (non-SSL + www)
http://example.com (non-SSL, no www)


Example of the GA tracking code I am using:

<script type="text/javascript">
window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', 'UA-1234567891-0', { 'cookieDomain': 'example.com' } );
// Plugins
ga('require', 'linkid', 'linkid.js');ga('require', 'outboundLinkTracker');
ga('send', 'pageview');
</script>

10.01% popularity Vote Up Vote Down


Report

 query : Re: CDN setup without nameserver AND cname manipulation Is it possible? Does somebody know, how is it technically possible to setup a CDN without nameserver and cname manipulation?

@Nimeshi995

Not supported by all Registrars

If you want the ability to create additional sites or content delivery networks without having to edit any DNS entries, you can adopt a wildcard setup. Sadly, not all registrars allow you to do this, but an example of a wildcard A record looks like this: *.cdn.
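As an illustration, a wildcard A record in a zone file might look like the following; the hostname and IP address are placeholders:

```
; Hypothetical wildcard A record: any single label under
; cdn.example.com resolves to the same server.
*.cdn.example.com.   3600   IN   A   203.0.113.10
```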

Virtual Host

This allows you to create sites, apps, and content delivery networks just by editing the virtual host on your server. For example, *.cdn.example.com becomes uniquestring.cdn.example.com; this is how Rackspace, Azure, and Amazon issue CDN hostnames for their customers.
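On the server side, a single Apache virtual host can then answer for every generated hostname. A minimal sketch, with the names and document root as assumptions:

```apache
<VirtualHost *:80>
    ServerName  cdn.example.com
    # Matches uniquestring.cdn.example.com, anotherstring.cdn.example.com, ...
    ServerAlias *.cdn.example.com
    DocumentRoot /var/www/cdn
</VirtualHost>
```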

Amazon Route 53

If your current domain registrar does not allow wildcards, then take a look at Amazon Route 53.

10% popularity Vote Up Vote Down


Report

 query : Re: How to stop Google from indexing pages no longer used URL parameters I made a complete rewrite of a webpage. The previous version was made in PHP and the whole URL scheme was like this: domain.com/?p=subpage

@Nimeshi995

One simple solution is to allow search engines to crawl the URLs pointing to the web pages with the same content, and to mark them as duplicates using the rel="canonical" element, a 301 redirect, or the URL parameter handling tool. To avoid excessive crawling of your site, adjust the crawl rate setting in Search Console. Duplicate content on a site is not penalized unless it appears to Google that the duplication is intended to manipulate search engine results; otherwise, Google simply indexes what it considers the best version of the content in the SERPs.
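For the old domain.com/?p=subpage scheme specifically, the 301 option could be sketched in .htaccess with mod_rewrite. This assumes the new URLs follow a /subpage/ pattern; adjust the capture and target to your actual mapping:

```apache
RewriteEngine On
# Capture the old ?p=<name> parameter and 301 it to /<name>/;
# the trailing "?" drops the old query string from the target URL.
RewriteCond %{QUERY_STRING} ^p=([a-z0-9-]+)$ [NC]
RewriteRule ^$ /%1/? [R=301,L]
```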

10% popularity Vote Up Vote Down


Report

 query : Re: Submit to Google without logging in? I want to submit certain web pages to Google so they can be indexed after they have been added or updated, but currently Google seems to be requiring a

@Nimeshi995

Unfortunately, there is no webform or tool (outside of your CMS or some paid enterprise solution) you can visit and enter a page URL and send to Google or Bing, at least not one that does so reliably. Search engines get pinged constantly as it is, and they want to make sure that you at least care enough about your website to set up and verify Google Search Console or Bing Webmaster Tools. Bing does offer a tool you can use to request a re-crawl of your site, but it's a whole site thing, not an individual page, and is still less reliable. (https://www.bing.com/toolbox/submit-site-url)

If you're running on a CMS like WordPress, most sitemap plugins offer a setting where you can ping the major search engines every time your sitemap updates with new content. However, in my experience, it is a weaker signal than the one you send when you submit via GSC or BWT. (I've run experiments where the latter yielded results within a couple of days, while automated sitemap pinging took much longer or produced no results at all, until the whole site was re-crawled anyway.)
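For reference, the ping endpoints these plugins typically hit are plain GET URLs with the sitemap address as a parameter (at least at the time of writing; the exact endpoints may change, and the sitemap URL below is a placeholder):

```
https://www.google.com/ping?sitemap=https://example.com/sitemap.xml
https://www.bing.com/ping?sitemap=https://example.com/sitemap.xml
```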

If you don't want to potentially wait several days to be re-crawled, the best course of action is to log in and submit your new URL via the preferred, official sources or get someone else you trust to do it for you.

10% popularity Vote Up Vote Down


Report

 query : Re: Why is this site still down when the domain was renewed almost a week ago? The domain of interest was renewed a week ago, but the site is still down. Shouldn't the site be back up by now?

@Nimeshi995

There is something called a domain lifecycle, and it goes as follows.


The domain is available.
(While a domain name is in the 'Available' stage, it can be registered by any person or organization via a domain registrar that operates under the TLD authority.)
The domain is active.
(When a domain is in this phase it means that it has already been registered and is fully functional. Domains of the most common extensions can be registered for a minimum of 1 year and a maximum of 10 years. During this state the domain can be renewed at any time, but its maximum life cannot be extended beyond 10 years.)
The domain has expired.
(In this phase the domain will no longer work, and any website or services on the domain will no longer be accessible through it. The domain will stay in this phase for about 45 days and can be renewed only by its owner. If the domain is renewed, it will go back to the Active phase.)
The domain is in Redemption/Grace Period.
(If a domain name is not renewed within the 45-day period of the Expired state, it will go into the Redemption phase. The Redemption period might vary depending on your domain registrar or domain extension, but usually lasts for up to 30 days. During this period the domain can be renewed only by its owner, but additional fees, depending on the registrar, will apply to reactivate the domain.)
The domain is pending Deletion.
(If you do not renew your domain name during the Expired or Redemption phase, your domain will go into the Pending Deletion stage. Domains in Pending Deletion cannot be renewed and usually stay in this phase for up to 5 days, until all records for the domain are removed from the domain authority zone.)
After Pending Deletion, one of two things happens:
A. The domain becomes available again.
B. The domain goes to auction.
(Depending on the original domain registrar, the domain may be released and set back to Available status after the Pending Deletion phase. Still, some registrars might withhold the domain and/or list it on domain auction websites.)


Source: www.fastcomet.com/tutorials/getting-started/domain-life-cycle

10% popularity Vote Up Vote Down


Report

 query : Re: Setup adwords cross-domain(subdomain) conversion tracking How can i setup cross domain conversion tracking with adwords? My main site is "www.mainsite.com" My conversion goal is visit "apply.mainsite.com"

@Nimeshi995

AdWords does not support cross-domain tracking; however, Google Analytics (AKA GA) does. Simply link your AdWords account to your GA account. Below are quotes from Google's help pages; they are snippets, and for full information you should visit each source.

Link Adwords With Google Analytics


SOURCE

Link AdWords and Analytics

The linking wizard makes it easy to link your AdWords account(s) to multiple views of your Analytics property. If you have multiple Analytics properties that you want to link to your AdWords account(s), complete the linking process for each property.


1. Sign in to Google Analytics. Note: You can also open Analytics from within your AdWords account. Click the Tools tab, select Analytics, and then follow the rest of these instructions.
2. Click Admin and navigate to the property you want to link.
3. In the PROPERTY column, click AdWords Linking.
4. Click + NEW LINK GROUP.
5. Select the AdWords accounts you want to link, then click Continue.
6. Turn linking ON for each view in the property in which you want AdWords data.
7. Optionally, select Enable Google Display Network Impression Reporting to also include that data in each view.
8. If you've already enabled auto-tagging in your AdWords accounts, or if you want to let the linking process automatically enable auto-tagging in your AdWords accounts, skip to the next step (9). However, if you want to manually tag your AdWords links, click Advanced settings > Leave my auto-tagging settings as they are.
9. Click Link accounts.


Congratulations! Your accounts are now linked. If you opted to
use auto-tagging (recommended), Analytics will start
automatically associating your AdWords data with customer clicks.


Setup Cross Domain Tracking in GA


SOURCE

Set up cross-domain tracking (analytics.js)

Cross-domain tracking makes it possible for Analytics to see sessions on two related sites (such as an ecommerce site and a separate shopping cart site) as a single session. This is sometimes called site linking.

To set up cross-domain tracking, you'll need to be comfortable editing HTML and coding in JavaScript, or have help from an experienced web developer.

Set up cross-domain tracking by modifying the tracking code

To set up cross-domain tracking for multiple top-level domains, you need to modify the Analytics tracking code on each domain. You should have a basic knowledge of HTML and JavaScript or work with a developer to set up cross domain tracking. The examples in this article use the Universal Analytics tracking code snippet (analytics.js).

ga('create', 'UA-XXXXXXX-Y', 'auto', {'allowLinker': true});
ga('require', 'linker');
ga('linker:autoLink', ['example-2.com']);



E.g. for a subdomain setup, use the linker with ['shop.example.com', 'www.example.com'].
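To see what those ga(...) calls do before analytics.js loads, here is a minimal sketch of the command-queue stub, the same pattern as the standard snippet; the property ID and domain names are placeholders:

```javascript
// Stub used by the standard snippet: before analytics.js loads,
// ga() just records each command so the library can replay them later.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };

// Queue the cross-domain configuration for the primary domain;
// 'UA-XXXXXXX-Y' and the domain list are placeholders.
ga('create', 'UA-XXXXXXX-Y', 'auto', { allowLinker: true });
ga('require', 'linker');
ga('linker:autoLink', ['shop.example.com', 'www.example.com']);

// Three commands are now queued, waiting for analytics.js.
console.log(ga.q.length); // 3
```

Once the real analytics.js library arrives, it replaces this stub and drains the queue in order, which is why the call order (create, require, then configure) matters.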

10% popularity Vote Up Vote Down


Report
