vmapp.org

@Ravi8258870

Ravi8258870

Last seen: Tue 11 May, 2021

Recent posts

query : Many websites miss the HTML id or name attributes (no support for fragment URIs). What can be done about it?

@Ravi8258870

Posted in: #Browsers #Html #Hyperlink #Navigation

I like the idea that HTML header elements have the id attribute. It allows sharing links with fragments, for example on Wikipedia:
en.wikipedia.org/wiki/Fragment_identifier#Proposals
However, many top sites do not have id attributes, at least not on their header elements. For example:


Google search results, www.reddit.com, https://www.linkedin.com/help/linkedin/forum, and www.raspberrypi.org/blog/

have almost no id attributes. This makes the HTML look clean and tidy, but much less navigable.

Why is this the case these days, and what can be done about it? Is there another way to point a browser to a specific portion of a potentially large page?
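For readers unfamiliar with how fragment navigation works, here is a minimal sketch (the page name and heading id are invented for illustration):

```html
<!-- page: https://example.com/guide.html -->
<h2 id="installation">Installation</h2>
<p>...</p>

<!-- elsewhere, a deep link straight to that heading: -->
<a href="https://example.com/guide.html#installation">Jump to Installation</a>
```

Without the id on the heading, the #installation fragment has nothing to resolve to, which is exactly the navigability loss described above. As a partial workaround for pages without ids, some browsers support text fragments of the form `#:~:text=Installation`, but that is a newer, browser-specific feature rather than something the page author controls.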


query : What needs to be done to user generated outbound links to prevent hurting SEO?

@Ravi8258870

Posted in: #Links #Nofollow #Seo #UserGeneratedContent

I have a site that contains user/external application generated content (content from SMS messages).

I want links to be clickable for user friendliness, but I don't want any potentially "spammy" links affecting my SEO negatively. Is adding rel="nofollow" enough, or should I take other precautions?
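One common approach is to add link-scheme hints when rendering user-supplied links; a sketch (the URL is illustrative). Google also introduced rel="ugc" specifically for user-generated content in 2019, and the values can be combined for compatibility with crawlers that only know nofollow:

```html
<!-- A user-submitted link rendered with hints telling crawlers
     not to pass ranking credit through it. -->
<a href="https://user-submitted.example/some-page" rel="nofollow ugc">user link</a>
```

Stricter options include rendering the links as plain text or routing them through a redirect endpoint that is blocked in robots.txt.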


query : Angular 4 and meta descriptions in the Google SERP

@Ravi8258870

Posted in: #AngularJs #Googlebot #GoogleSearchConsole #Seo #Webmaster

SPAs and Angular 4 websites use a shadow DOM, so how can the correct meta description that was added to the page be used or shown?

Do I need this:

<meta name="googlebot" content="code">


or

<meta name="googlebot" content="noodp">


Since Google changed its meta description logic, we're getting weird results and even wrong pages in the SERP that are not related to the search.

In some cases the description in the SERP is related to the query, but the title and URL of the page shown by Google are not. The weird thing is that the description shown in the SERP comes from another related product or article on those pages.

Any ideas?

It seems Google doesn't correctly show the meta descriptions for SPA websites or the latest Angular version.

How can I get it to show the meta description that was added to the code?
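Since crawlers may not reliably execute client-side JavaScript, the usual fix is to emit the description in the server-rendered HTML rather than setting it in the browser. A minimal, framework-agnostic sketch (the function and sample text are illustrative, not Angular APIs):

```javascript
// Build a meta description tag on the server so it is present in the
// initial HTML payload, before any client-side framework boots.
function renderMetaDescription(description) {
  // Escape characters that would break out of the attribute value.
  var escaped = String(description)
    .replace(/&/g, '&amp;')
    .replace(/"/g, '&quot;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  return '<meta name="description" content="' + escaped + '">';
}

console.log(renderMetaDescription('Fast "angular" guides & tips'));
```

In Angular specifically this is the role of server-side rendering (Angular Universal) together with the Meta service from @angular/platform-browser, but any mechanism that gets the tag into the initial HTML works.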


query : Submit to Google without logging in?

@Ravi8258870

Posted in: #GoogleIndex #Indexing #SearchEngines

I want to submit certain web pages to Google so they can be indexed after they have been added or updated, but currently Google seems to require a webmaster to log in to do this.

I do not want to log in to Google to submit my pages.

Is there any way to submit web pages to Google for indexing without logging in?


query : Google reports that URLs in news sitemap aren't on a verified news site

@Ravi8258870

Posted in: #GoogleNews #GoogleSearchConsole #Seo

I have a news site in Bengali and English. It has been included in Google News and is getting indexed, but Search Console is showing two errors about the sitemap:


Your Sitemap is on a site that is not in the Google News database. Google News can only accept Sitemaps from sites that we crawl. If your site is crawled by Google News, please check that the URL of your Sitemap agrees with the URLs of your articles as they appear on Google News, including any leading "www". If you would like to request inclusion of your site in Google News, please contact the Google News support team.


What am I supposed to do to fix this?


query : Should the escaped fragment URL or the hash bang URL be used with Fetch as Google in Search Console?

@Ravi8258870

Posted in: #Ajax #GoogleSearchConsole #Url

Which URL must be added to the URL input field of Google Search Console -> Crawl -> Fetch as Google?

example.com/?_escaped_fragment_=my-ajax-page


or

example.com/#!my-ajax-page


query : Using Product Structured Data when not selling online

@Ravi8258870

Posted in: #Seo #StructuredData

I'm currently working on a site that sells expensive but very niche products that have to be built to a certain specification before being made to order. We will never sell these products to the general public, and we expect the people who buy them to know what they are looking for.

Is there any point using structured data in this case?
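For reference, Product structured data can be used without any pricing information; a minimal JSON-LD sketch (the names are invented for illustration). Note that without offer, review, or rating fields the page is unlikely to qualify for Google's Product rich results, but the markup still helps machines understand what the page is about:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Custom Industrial Widget X200",
  "description": "Made-to-order unit built to customer specification.",
  "brand": { "@type": "Brand", "name": "Example Manufacturing" }
}
```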


query : How can I speed up my WordPress website?

@Ravi8258870

Posted in: #PageSpeed #Wordpress

My website is based on WordPress and I want to know why its loading metrics are so bad. What can I do, practically?

I've already tried to use:


Cloudflare
W3 Total Cache
Autoptimize
EWWW Image Optimizer


query : Is the Google cache of my site = the last time they indexed it?

@Ravi8258870

Posted in: #Google #GoogleCache

The Google cache of my site is a week old, and I would have expected them to have visited the site since then and indexed new content.

Does the cache lag the data collection or is it an actual indication of when the site was last crawled?


 query : Re: How can I stop this PHP Notice or fix the problem that clutters up the debug.log after switching to PHP 7? EDIT: I am the administrator of several websites and when I turn debug on to track

@Ravi8258870

HTTP_ACCEPT may well be the correct environment var name, but that header won't necessarily be set, which is the cause of the warning. This is not a difference between PHP 5.6 and PHP 7 but more a difference in PHP's error reporting setting (the default of which might have changed between versions).

Since the header may not be set, checking whether or not it is before performing the regular expression check will fix the warning:

if ( isset($_SERVER['HTTP_ACCEPT']) && preg_match('/wap\.|\.wap/i', $_SERVER['HTTP_ACCEPT']) ) {
    return true;
}


query : HTTPS pages crawled despite never buying an SSL certificate

@Ravi8258870

Posted in: #Http #Https

I have a website built in WP. I never bought any SSL certificate for my website, and not a single link points to HTTPS. However, 2 of the pages have been crawled with HTTPS while the rest are HTTP. Now, when someone lands on those pages from the SERP, an error message is shown.

I talked to the hosting company. Initially they said that a free SSL certificate was issued to my website (which I never requested), but when I created a ticket, the support team said it's a search engine issue and they can't do anything.

Shall I redirect the 2 pages from HTTPS to HTTP?


 query : Re: How to check on the 99.99% Up time Guarantee? I subscribed to a Virtual Private Server (VPS) which promises 99.99% of up time guarantee. But I often encounter down time, slow access time and

@Ravi8258870

In addition to the above suggestions...

Assuming that you're self-managing the VPS, make sure the problems aren't of your own making or a result of a VPS provisioned with inadequate resources to handle the work you're asking it to do, especially under peak load.

Once you're certain that the problem is upstream of your VPS, if the downtime is frequent, my advice would be to do a well-planned, well-executed move elsewhere rather than engaging in a battle to hold your current provider to their SLA. The SLA probably has more loopholes than a meeting of an old ladies' crocheting club, anyway.

In short, if you're absolutely certain that your downtime and slow performance are the host's fault, I suggest you stop trying to "get them" and get gone instead.


 query : Re: AngularJS application and searchengine optimization, empty Google cached page version I have created a website using AngularJS. I'm not sure this website will be properly indexed by Google and

@Ravi8258870

You definitely want to find/build a backend rendering tool for your initial view. If you view source on those pages, you won't find any meaningful content for the bots to scan. The robots aren't rendering a page, so 90% of what matters is the text content of the HTML file the robot loads.

You'll at the very least want your description and reviews to load (reviews because of freshness and diversity of content). You don't need something complex like prerender. If you have trouble, you can dynamically backend populate the description/reviews in the ui-view without even running angular on the backend. Angular will delete everything in the ui-view on render and replace it with the page, but that's okay as long as you're replacing it with the same content.


query : GTM Size Budgets - What's an average Google Tag Manager payload size (kB/files/other)?

@Ravi8258870

Posted in: #GoogleTagManager #Optimization #Tracking

On a landing page I've built, 80% of what the user downloads in terms of kB and even more in terms of number of files comes from GTM. It's an extremely condensed site outside of the GTM implementation, but it has left us with the question of how large a GTM payload can reasonably be. How can we set ourselves a budget? Is there an average size to compare with? Even if the load is async, there are concerns of the user downloading too much useless nonsense and the extra js slowing down mobile devices.


query : 301 Redirects and Merchant Center quality

@Ravi8258870

Posted in: #301Redirect #GoogleAdwords #Shopping

I moved a site to a new platform and the URL structure was updated. As far as Google Merchant Center goes, will the 301 redirects suffice, or do I need to update my URLs within Merchant Center?


query : How do you redirect a reputable domain name to a spelling variation without losing SEO value?

@Ravi8258870

Posted in: #301Redirect #GoogleNews #GoogleSearch #Seo

We have a reputable news domain which is about 17 years old. We want to redirect this domain from example-all.tld to exampleall.tld. We will implement a 301 permanent redirect for all URLs.

Now traffic is coming from search on Google for the brand name and business keywords. For these keywords we are first in the search results for pretty much all search engines. We have many organic backlinks for example-all.com domain, Moz Domain Authority 64 & Page Authority 70.

When we redirect the domain, what types of SEO problems will befall the website? Will we get the same value of previous domain for backlinks, Google search, reputation, etc? Is there any other method for this redirect from an SEO perspective?
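For reference, the redirect itself is typically one host-level rule; a sketch assuming Apache with mod_rewrite, with the domain placeholders taken from the question:

```apache
# .htaccess on the old host: send every URL to the same path
# on the new domain with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example-all\.tld$ [NC]
RewriteRule ^(.*)$ https://exampleall.tld/$1 [R=301,L]
```

Keeping the old domain registered and the redirect in place long-term is generally recommended so that backlink equity continues to flow to the new domain.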


query : How can I block visitors from a company domain on an Apache website?

@Ravi8258870

Posted in: #Apache #Htaccess

So, I am running a site on Apache. I want to block all visitors from the domain *.xyz.com.

I tried adding the following in .htaccess file:

<Files *>
order allow,deny
allow from all
deny from .*example.com.*
</Files>


This didn't work. Can anyone help?
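For context, Apache's old Allow/Deny directives match client hostnames via reverse DNS, not regular expressions, and on Apache 2.4 they are replaced by the Require directives. A hedged sketch of both forms, assuming the goal is to block clients whose reverse DNS ends in xyz.com:

```apache
# Apache 2.2 style: partial domain-name match. Requires working
# (double-)reverse DNS for the client, which carries a performance cost.
Order allow,deny
Allow from all
Deny from .xyz.com

# Apache 2.4 style (use this instead on 2.4, not alongside the above):
<RequireAll>
    Require all granted
    Require not host xyz.com
</RequireAll>
```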


query : What is the General Data Protection Regulation (GDPR)?

@Ravi8258870

General Data Protection Regulation (GDPR) is a new legal framework in the EU. It was enacted by the European Parliament, the Council of the European Union, and the European Commission, with the intention to strengthen and unify data protection for all individuals within the European Union (EU). It will apply starting on May 25, 2018.


query : Does Google AdWords revisit the landing page to see if it changed?

@Ravi8258870

Posted in: #GoogleAdwords

If I change the landing page of my ads on Google AdWords to fit better with the exact wording on the ads, will Google notice this change and update the ads' quality score?


query : Google Tag Manager: Extract value from a Custom JS Object

@Ravi8258870

Posted in: #GoogleTagManager #Javascript #JsonLd #SchemaOrg #Seo

I am implementing schema.org markup using JsonLD and Google tag manager.
So I have a list of items that looks like this:

data.itemListElement.push(
{
"@type": "ListItem",
"position": "2",
"item": {
"@type": "Product",
"image": "{{SCHEMA - Product Image}}",
"url": "{{SCHEMA - Product URL}}",
"name": "{{SCHEMA - Product Name}}",
"brand": "{{SCHEMA - Product Brand}}",
"category": "{{SCHEMA - Product Category}}",
"aggregateRating": {
"@type": "aggregateRating",
"ratingValue": "5",
"reviewCount": "700"
}
}
}
);


And I am getting the list of the different variables using a custom JavaScript variable that looks like this:

function() {
    var urlList = [];
    var urls = document.querySelectorAll("div.title-box-product-catalogue a.ng-binding");

    for (var i = 0; i < urls.length; i++) {
        urlList.push({ 'url': urls[i].href });
    }
    return urlList;
}


It does return the list of URLs for my products, so this part is working fine.

My problem is using this array in my tag. I cannot access the values.

I tried many things like:

"{{SCHEMA - Product URL}}.0.url",

"{{SCHEMA - Product URL.0.url}}",

"{{SCHEMA - Product URL}}[0].url",


But none of them seems to work. What I get in preview (and live) mode is always a list of objects. I checked, and these objects are filled with the right URLs. Any idea how I can access them?
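GTM tag template fields treat {{variable}} references embedded in a string as plain text, so the indexing cannot happen inside the tag field itself. One common workaround is to do the indexing inside another custom JavaScript variable and reference that variable from the tag. Since custom JS variables are ordinary functions, the indexing is plain JavaScript; a sketch (the function and variable names are illustrative, and the array is stubbed in directly where GTM would supply the inner {{...}} reference):

```javascript
// Sketch of the body of a second GTM custom JavaScript variable that
// picks one URL out of the array returned by the first variable.
function productUrlAt(urlList, index) {
  // Guard against a missing or short list so the tag never receives undefined.
  if (!Array.isArray(urlList) || index >= urlList.length) {
    return '';
  }
  return urlList[index].url;
}

// Example input shaped like the question's custom variable output:
var urlList = [
  { url: 'https://example.com/product-1' },
  { url: 'https://example.com/product-2' }
];

console.log(productUrlAt(urlList, 0)); // → https://example.com/product-1
```

In GTM itself you would create one such variable per position (or build the whole itemListElement array in a single custom JS variable) and reference it as {{SCHEMA - Product URL 0}} in the tag.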


query : GDPR compliant contact form

@Ravi8258870

Posted in: #Compliance #ContactPage #Forms #Gdpr

As of May 2018, the General Data Protection Regulation (GDPR) will come into effect, and I am wondering how best to comply with it when implementing a simple contact form.

The form, let's say, requires the person's name and email, and has optional telephone and message fields. The form data is then sent to an email address, as well as being stored in a database.

I understand that we must:


Explain what personal information is being used for, and why
Give the user a means to easily see the data that is being held
Give the user the option to remove this data.


Whilst these steps are ok, I have also read that we are obliged to confirm the user's identity - the suggested method is a double opt-in. Surely this can't apply to a contact form?

So in summary, what should be done to allow the user to fill out the form, but for us to comply with the new regulations?


query : Avoid email address being blacklisted when contacting writers

@Ravi8258870

Posted in: #Domains #Spam

I want to email people who wrote articles which are related to my software. Not articles about my software, just articles that make me think that the author might want to write an article about my software, or at least, mention it in an article.

I realize that some or most of my email will not bear fruit. That's fine.

What I'm afraid of is my domain being blacklisted as a spam source.

I'm not planning on sending more than 10 emails a week, and probably far fewer, but it will, of course, be "spammy" at least in the sense that it is not solicited, and it might include problematic words like "free", "new", etc.

Is there a real chance of this amount of email causing a domain to be blacklisted? And if so, is there a solution short of getting a new domain dedicated to these emails (which, at least, would separate my main domain name from them)?

I'm assuming that undoing blacklisting is very difficult or impossible, so just taking a chance isn't worth it. Please correct me if I'm wrong about this.
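Independent of volume, authenticating your outbound mail makes blacklisting much less likely. A sketch of the relevant DNS TXT records (the values are illustrative; the include host depends on your mail provider):

```text
; SPF: declares which servers may send mail for the domain
example.com.         TXT  "v=spf1 include:_spf.mailprovider.example ~all"

; DMARC: policy plus an address for aggregate reports
_dmarc.example.com.  TXT  "v=DMARC1; p=none; rua=mailto:postmaster@example.com"
```

DKIM signing (configured at the mail provider) completes the trio; together these tell receiving servers the mail genuinely comes from your domain.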


query : How to filter product performance by past page view in Google Analytics?

@Ravi8258870

Posted in: #AdvancedSegments #Ecommerce #GoogleAnalytics

I would like to compare a product's performance with no filter (all users segment) VS when the user has also been on a given page (eg. a similar product's page).

I tried my hand at an advanced custom segment, but apparently mixing "hits" with "user" segments doesn't work. The data it returns is absurd (it returns the same number of purchases as the all-users segment).

How should I go about this?


query : Can a domain be obtained and used without a domain registrar?

@Ravi8258870

Posted in: #DomainRegistrar #Domains #WebHosting

I guess the greater question should be: why is a domain registrar necessary? Presumably one exists for the ease of the average person. I understand that the purpose of a registrar is that all people have to do is type www.example.com/most_trafficked_domain_ever into their browsers and they are taken to your physical server location, regardless of whether you move.

That being said, is it not possible for them to connect directly to your domain without a registrar?

*I assume it is; that is how "some parts" of the internet work.


query : Apache HTTP server and redirect-carefully

@Ravi8258870

Posted in: #Apache #Redirects

I checked a server's configuration file and saw the 'redirect-carefully' setting.

Checking the docs:


redirect-carefully

This forces the server to be more careful when sending a redirect to
the client. This is typically used when a client has a known problem
handling redirects. This was originally implemented as a result of a
problem with Microsoft's WebFolders software which has a problem
handling redirects on directory resources via DAV methods.

httpd.apache.org/docs/2.4/env.html
This does not say much about how exactly the server behaves differently when redirect-carefully is set.

What exactly does it mean to "be more careful when sending a redirect"?


query : Site is ranking well nationally but lagging behind locally

@Ravi8258870

Posted in: #LocalSeo #Seo

I have a trade school client in a fairly competitive, SF Bay Area market. They're competing with a couple of national brands and a large handful of entrenched competitors, all of whom are pursuing an active paid and organic search strategy.

We're on month 4 of a dedicated SEO push and tracking our SERPs for a large volume of keywords, we've been steadily moving in the right direction almost across the board. We relaunched the site in June (~4 months ago) and had a few initial hurdles to get over (their old site had been hacked with a number of spam pages added).

We've been very focused on local - Targeting a handful of geo-qualifiers ("Bay Area", "East Bay" and a few individual town/city names) in addition to all of the foundational pieces around Google MyBusiness, directories & data aggregators, etc. We made a large round of sledge-hammer style, on-page updates in early July and have been making smaller, data-driven on-page updates since (as well as adding focused content).

The Question:

We've gotten to the point where we're ranking first page nationally for a large handful of these geo-qualified keywords and search phrases, but still lagging in the 3rd-6th page when pulling the searches locally using both MOZ and SERPS.com.

Does this seem odd? Any high-level suggestions beyond time? Ranking nationally, while cool I suppose, doesn't do much for the business.

A glaring example: Using MOZ, we rank 6th nationally for "Bay Area Beauty School". The same search originating from the school's city ranks 66th =/ ?!?!?!

What we've done:

-MyBusiness/Places 100% complete with accurate info
-Social Media/Review/Directory/Data Aggregators 100% complete and consistent across the board
-Dedicated link building, including a couple of local newspapers, Better Business Bureau, and others.
-Focused review campaign with a Google first/Yelp second strategy (purposely spread out to avoid too many at once)
-The local space is competitive with 10+ similar schools and a lot of lead-generation ('Find a School Near You') directory sites pushing for the same targets.

We generally top the localpack when searched from a 1-2 town radius from their location but quickly fall off to "under the fold" as soon as you get beyond that. Competitive schools significantly further away are pushing us out even 2 towns away.

Thanks for any thoughts and advice.


query : What is the SEO impact of text appearing on mouse over?

@Ravi8258870

Posted in: #HiddenText #Seo

As a web designer, I wonder whether a design where the text is displayed only when your mouse is over an image would be impacted in terms of SEO. (It would be almost all of the text, except on devices without a mouse.)

For example: www.w3schools.com/howto/tryit.asp?filename=tryhow_css_image_overlay_fade
but with proper headings and paragraphs.

I can think of many ways to achieve this technically, but I have no clue about the impact on SEO.


 query : Re: Google Analytics hostname segment has results with other hostnames I use one Google Analytics tracking code across multiple subdomains. I have created a segment for a specific subdomain (lookatme.mysite.com)

@Ravi8258870

Segments are session based. Pageviews are hit based.

The segment you created will only include the sessions that included the specified sub-domain. However, if you then look at the pageviews, it will also include the pageview details associated with any of the other sub-domains or the parent domain that were visited during that same session.

If you are going to need to separate out the pageview data for the subdomain on a regular basis, you may do better to create a new View with an include-hostname filter for the relevant subdomain.

You may also want to try modifying your segment to add another condition so that the unwanted main domain and other subdomains are not included.

Further info here on when to use segments vs. filters: support.google.com/analytics/answer/7331978?hl=en
The following article helps to explain it a bit more clearly: brianclifton.com/blog/2009/12/09/how-to-choose-between-advanced-segments-versus-profile-filters-in-google-analytics/ (I'm not affiliated with the website, its owner, or anything the site may be promoting/selling.)


query : Re: .htaccess - Remove .html from the homepage but not from folders within that website. I'm trying to remove the .html from my landing page; an example would be, instead of domain.com/contact.html

@Ravi8258870

#301 from example.com/page.html to example.com/page
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9} /.*\.html HTTP/
RewriteRule ^(.*)\.html$ /$1 [R=301,L]



Let's focus on this one, as the one you want to change. Firstly, I would say the THE_REQUEST condition is just adding complication. I guess its purpose is to stop the rule adding the .html back that was removed during the first 'internal' rewrite.

I would suggest that it's much better just to fix that by putting this rule first, i.e. have the 301 removing the extension FIRST (it has L in it anyway), and then have the internal rewrite.

Then it seems you just need

RewriteRule ^([^/]+)\.html$ /$1 [R=301,L]


[^/] means match anything except a slash, so paths that contain a slash (e.g. folders!) won't match.

You may also want to consider adding:

RewriteBase /


to the beginning of the file to make sure that relative paths work, i.e. that the rewrite rule isn't matching a path that already has slashes at the start.

Alternatively use:

RewriteRule ^/?([^/]+)\.html$ /$1 [R=301,L]

