@Yeniel560

Yeniel560

Last seen: Tue 11 May, 2021

Recent posts

 query : Re: Can I set up two WordPress servers; a Read-Write server on a local Intranet which is then synced to a Read-Only public Internet server? Background I've got a website set up running WordPress.

@Yeniel560

Answer

Technically it's possible, but it's way too complicated. There are better ways.

Some Reasons

Some issues that I ran into when trying this:


You have to sync both the DB files AND the physical "WP Uploads" folder at the same time, any time a change is made.
You'd have to figure out a way to have author attribution on the front end of the live server, which doesn't have user accounts. The only way I can think of to do this is to mirror the Users table with a CPT called UsersFake, keep a hidden metadata text field on Posts that copies each post's author, and then use UsersFake to build the front-end "Author" section at the end of each blog post. I played around with it a bit, but it got really complicated really fast.


Basically, it's not worth it.

What I Did Instead

The direction I went instead is to combine IP locking for logins with a snapshot backup system on the live server.
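
The "IP locking" part can be as simple as an .htaccess rule on the live server. Below is a minimal sketch, assuming Apache 2.4 and using 203.0.113.10 as a placeholder for the intranet's public IP:

# Hypothetical example: only the office/intranet IP may reach the login
# and XML-RPC endpoints; everyone else gets a 403.
<Files wp-login.php>
    Require ip 203.0.113.10
</Files>
<Files xmlrpc.php>
    Require ip 203.0.113.10
</Files>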


 query : Google Analytics data from previous years/months has changed We use Google Analytics to get stats for reporting at the end of each year. After taking all the statistics for 2016, I have entered

@Yeniel560

Posted in: #GoogleAnalytics #PageViews

We use Google Analytics to get stats for reporting at the end of each year. After taking all the statistics for 2016, I entered the same date range (Jan 1 2016 - Dec 31 2016) months later, and all of the data had increased quite significantly. For example, total page views for the year were 16,000 and now they are 18,400.

Can anyone advise why this is happening, or at least whether it is normal?


 query : Can I set up two WordPress servers; a Read-Write server on a local Intranet which is then synced to a Read-Only public Internet server? Background I've got a website set up running WordPress.

@Yeniel560

Posted in: #WebHosting #Wordpress

Background

I've got a website set up running WordPress. I have multiple people who need login access to manage the software. They all access the website from the same physical location, and I have access to a company intranet.

Standard Practice (?)

Normally, they would log in directly to the website, and their edits would be made to the live database.

My Goals


I want the login/admin side of the public website to be as inaccessible as possible; ideally zero WordPress logins on the live server.
I want data duplication, in case the live server goes down.


Plan

I want to set up a two part system, with an internal copy of the website that is hosted on a company intranet, accessible only to users who are on the internal network. They will make edits there.

I am then planning to use MySQL master-slave replication to mirror the data from the intranet copy (master) to the live copy (slave), with some changes on the live copy, such as disabling user logins completely.
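
Roughly, I expect the replication setup to look something like the sketch below (hostnames, credentials and log coordinates are placeholders; the real file and position would come from SHOW MASTER STATUS):

# my.cnf on the intranet (master) server
[mysqld]
server-id    = 1
log_bin      = mysql-bin
binlog_do_db = wordpress

# my.cnf on the live (slave) server
[mysqld]
server-id = 2
read_only = 1

-- run once on the slave to point it at the master and start replicating
CHANGE MASTER TO
  MASTER_HOST='intranet.example.internal',
  MASTER_USER='repl',
  MASTER_PASSWORD='replica-password',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=4;
START SLAVE;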



Question

Is this a good plan? Or is there a better option? (I am a front end dev who has been thrown into DB management, so there are probably lots of things I'm not aware of.)


 query : Re: Do Child Categories have any influence on Parent Categories, when it comes to SEO/SERP? I am currently working on a WordPress e-commerce website, where the chosen shopping platform is WooCommerce.

@Yeniel560

Yes. What you're doing there is building a proper semantic core, which will help you optimize a category for a topic rather than optimizing a single page for a keyword.

If done properly, with the right keywords and internal PageRank distribution, you'll get what you described in your question.


 query : Re: Multilingual site: offer translation on same domain or new domain? I have to translate my website in another language, and I was wondering: from a SEO point of view, is it better to have the

@Yeniel560

A separate domain is better for the following reasons:

You can host the new website on a separate server in the target country, which will improve loading speed.

You provide clear geotargeting signals for users and search engines.

The separation process is simple.


 query : Could not using Google Analytics penalize page rank in Google Search? I am using Yandex Metrika as tool for analytics. However I am concerned about the impacts in terms of SEO. For example. How

@Yeniel560

Posted in: #GoogleAnalytics #GoogleSearch #Seo #Yandex

I am using Yandex Metrika as my analytics tool. However, I am concerned about the impact on SEO. For example, how does Google know how much time users spend on my website? Will things like bounce rate and direct traffic be hidden from Google since I am not using Google Analytics?

Or does Yandex Metrika share this kind of data with Google? I am afraid that I am losing some good data that could improve my ranking in Google Search. Could anyone clarify whether my concerns are justified?


 query : Will creating an app that users prefer to the website reduce search engine rankings due to lower usage of the site? If all users use only iPhone or Android app instead of use the website,

@Yeniel560

Posted in: #Pagerank #SearchEngines #WebCrawlers

If all users use only the iPhone or Android app instead of the website, will my website lose rank on Google? I assume that when more users visit the website, its rank in search engines improves.

How can I prevent losing rankings by creating a popular app?


 query : PHP syntax issue I'm reading a Sentry tutorial for implementing an authentication workflow in a webapp. I'm a PHP beginner. There's a thing I don't understand in this code: <?php // set up

@Yeniel560

Posted in: #Authentication #Php

I'm reading a Sentry tutorial for implementing an authentication workflow in a webapp. I'm a PHP beginner. There's a thing I don't understand in this code:

<?php
// set up autoloader
require ('vendor/autoload.php');

// configure database
$dsn = 'mysql:dbname=appdata;host=localhost';
$u = 'sentry';
$p = 'g39ejdl';
Cartalyst\Sentry\Facades\Native\Sentry::setupDatabaseResolver(
new PDO($dsn, $u, $p));
[...]


What is the meaning of Cartalyst\Sentry\Facades\Native\Sentry? What's happening there? Is it a server file path? While debugging I found that this line doesn't execute.


 query : How to create an email address that is not affected when servers change? I change namservers almost every year because I change web hosting companies. I have plan to use Zoho for my emails. So how

@Yeniel560

Posted in: #Dns #Email #Zoho

I change nameservers almost every year because I change web hosting companies.

I plan to use Zoho for my email.

So how can I create an email address that is not affected when servers change? I mean, even if I change nameservers, email should keep working without a problem.


 query : Both HTTP and HTTPS showing up in Google SERPs Both http://example.com and https://example.com are showing up as duplicate content in the Google results. How do you remove the HTTP version from

@Yeniel560

Posted in: #DuplicateContent #GoogleSearchConsole #Https #Seo #Serps

Both http://example.com and https://example.com are showing up as duplicate content in the Google results. How do you remove the HTTP version from the Google index but keep the HTTPS version using Google Search Console? Whenever I temporarily block example.com, it removes both the HTTP and HTTPS versions from the index.


 query : 404 errors turning into 500 errors through internal redirects Today I checked my Apache error.log and I found many: Request exceeded the limit of 10 internal redirects due to probable configuration

@Yeniel560

Posted in: #Apache #Htaccess

Today I checked my Apache error.log and I found many:

Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.


...mostly from Bing bot.

So I traced the error in my access log and found that these errors are caused by sites/bots requesting a file that no longer exists. That should produce a 404 (Not Found) error, but instead the request is redirected to look somewhere else, and that place then tells it to look in the first place again, causing an internal redirect loop that only stops when it reaches the maximum of 10 internal redirects. At that point a 500 internal error is returned, and the URL gets crawled over and over again.

Subdomains are in:

/home/html/myname/public_html/_sub/thisisasubdomain/

Resulting domain is:

thisisasubdomain.myname.com

When I try to access a non existing directory in the _sub directory, then I get the 500 error/Request exceeded error.

My VPS came with a pre-installed server configuration containing this rewrite.conf:

RewriteEngine On
RewriteMap lowercase int:tolower

RewriteCond %{REQUEST_URI} !^/webmail(/|$)?
RewriteCond %{REQUEST_URI} !^/horde(/|$)?
RewriteCond %{REQUEST_URI} !^/roundcube(/|$)?
RewriteCond %{REQUEST_URI} !^/setup(2|-new)?(/|$)?
RewriteCond %{REQUEST_URI} !^/webftp(1|2)?(/|$)?
RewriteCond %{REQUEST_URI} !^/(db|pg)admin(/|$)?

RewriteCond ${lowercase:%{SERVER_NAME}} ^([0-9a-z-]+)\.([0-9a-z]+)$
RewriteRule ^(.+) ${lowercase:%{SERVER_NAME}}$1 [C]
RewriteRule ([0-9a-z-]+)\.([0-9a-z]+)/(.*) /home/html/$1.$2/public_html/$3

RewriteCond %{REQUEST_URI} !^/webmail(/|$)?
RewriteCond %{REQUEST_URI} !^/horde(/|$)?
RewriteCond %{REQUEST_URI} !^/roundcube(/|$)?
RewriteCond %{REQUEST_URI} !^/setup(2|-new)?(/|$)?
RewriteCond %{REQUEST_URI} !^/webftp(1|2)?(/|$)?
RewriteCond %{REQUEST_URI} !^/(db|pg)admin(/|$)?

RewriteCond ${lowercase:%{SERVER_NAME}} ^www\.([0-9a-z-]+)\.([0-9a-z]+)$
RewriteRule ^(.+) ${lowercase:%{SERVER_NAME}}$1 [C]
RewriteRule ([0-9a-z-]+)\.([0-9a-z]+)/(.*) /home/html/$1.$2/public_html/$3

RewriteCond %{REQUEST_URI} !^/webmail(/|$)?
RewriteCond %{REQUEST_URI} !^/horde(/|$)?
RewriteCond %{REQUEST_URI} !^/roundcube(/|$)?
RewriteCond %{REQUEST_URI} !^/setup(2|-new)?(/|$)?
RewriteCond %{REQUEST_URI} !^/webftp(1|2)?(/|$)?
RewriteCond %{REQUEST_URI} !^/(db|pg)admin(/|$)?

RewriteCond ${lowercase:%{SERVER_NAME}} !^www\.([0-9a-z-]+)\.([0-9a-z]+)$
RewriteCond ${lowercase:%{SERVER_NAME}} ^([0-9a-z-]+)\.([0-9a-z-]+)\.([0-9a-z]+)$
RewriteRule ^(.+) ${lowercase:%{SERVER_NAME}}$1 [C]
RewriteRule ([0-9a-z-]+)\.([0-9a-z-]+)\.([0-9a-z]+)/(.*) /home/html/$2.$3/public_html/_sub/$1/$4

RewriteCond %{REQUEST_URI} !^/webmail(/|$)?
RewriteCond %{REQUEST_URI} !^/horde(/|$)?
RewriteCond %{REQUEST_URI} !^/roundcube(/|$)?
RewriteCond %{REQUEST_URI} !^/setup(2|-new)?(/|$)?
RewriteCond %{REQUEST_URI} !^/webftp(1|2)?(/|$)?
RewriteCond %{REQUEST_URI} !^/(db|pg)admin(/|$)?

RewriteCond ${lowercase:%{SERVER_NAME}} ^(.+)\.([0-9a-z-]+)\.([0-9a-z-]+)\.([0-9a-z]+)$
RewriteRule ^(.+) ${lowercase:%{SERVER_NAME}}$1 [C]
RewriteRule (.+)\.([0-9a-z-]+)\.([0-9a-z-]+)\.([0-9a-z]+)/(.*) /home/html/$3.$4/public_html/_sub/$2/$5


Any help is appreciated.


 query : Re: Does Google AdWords revisit the landing page to see if it changed? If I change the landing page of my ads on Google AdWords to fit better with the exact wording on the ads, will Google notice

@Yeniel560

Definitely. It will revisit every time you make changes to your ad. But if you keep everything as it is (the same ad copy, URL and keywords) and only make changes to the landing page, it won't, since you are not changing the ad copy; once you do change the ad copy, it will check again for sure.


 query : Why does Namecheap keep sending me a message that says "The AutoSSL certificate renewal may cause a reduction of coverage" I'm working on a client's website that was set up by someone else.

@Yeniel560

Posted in: #Cpanel #Https #Namecheap #SecurityCertificate #WebHosting

I'm working on a client's website that was set up by someone else. It's hosted by Namecheap and we have access to cPanel.

The client is apparently receiving this email message twice a day (I've replaced their domain name with example.com):


[example.com] ⚠ example.com: The AutoSSL certificate renewal may cause a reduction of coverage on 2017-11-28 at 00:00:00 UTC

The “cPanel” AutoSSL provider could not renew the SSL certificate without a reduction of coverage because of the following problem:

The system failed to fetch the DCV (Domain Control Validation) file at example.com/.well-known/pki-validation/(long string of letters and numbers).txt because of an error: The system failed to send an HTTP (Hypertext Transfer Protocol) “GET” request to example.com/.well-known/pki-validation/(long string of letters and numbers).txt because of an error: Size of response body exceeds the maximum allowed of 16384

The domain “example.com” resolved to an IP address “(the IP address of our website)” that does not exist on this server.


I've been Googling around but there doesn't seem to be much about this. I looked through the "SSL/TLS Manager" section of cPanel but it appears to be blank. They've been using HTTP this whole time so it doesn't seem like it would be a case of an SSL certificate expiring.

What could be causing this kind of error and what kind of steps should I take next?


 query : Re: User contributed images on a HTTPS site without mixed content warnings I have a forum where, like most forums, users can post images. I've set up HTTPS across the site, but of course most

@Yeniel560

Just realised that I never posted my solution. The answer provided in Stephen's comment is what solved it for me. In short, I created a proxy script that does the following:


If the image is https, leave it alone.
If the image is http and from a site known to support https (e.g. my own site, imgur.com etc) then rewrite to https.
Otherwise if the image is http, rewrite it to use example.com/imgproxy?img=ORIGINALURL&hash=KEY

Then the proxy script fetches the HTTP image, caches it locally, and outputs the image data. On repeated requests it outputs the cached data directly. The linked SO answer describes the security hash and other details.
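
For anyone who wants a starting point, a stripped-down sketch of such a proxy script might look like this (not the exact code I use; the secret key, cache directory and size limit are placeholders):

<?php
// imgproxy.php - simplified sketch of an HTTP-image caching proxy.
$secret   = 'change-me';               // shared secret used for the URL hash
$cacheDir = __DIR__ . '/imgcache';     // where fetched images are cached
$maxBytes = 5 * 1024 * 1024;           // refuse to serve anything bigger

$img  = isset($_GET['img'])  ? $_GET['img']  : '';
$hash = isset($_GET['hash']) ? $_GET['hash'] : '';

// The hash keeps third parties from using the proxy for arbitrary URLs.
if ($img === '' || !hash_equals(hash_hmac('sha256', $img, $secret), $hash)) {
    http_response_code(403);
    exit;
}

// Only plain-HTTP URLs are proxied; HTTPS images are left alone by the rewriter.
if (stripos($img, 'http://') !== 0) {
    http_response_code(400);
    exit;
}

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0700, true);
}
$cacheFile = $cacheDir . '/' . sha1($img);

// Fetch and cache the image on the first request.
if (!is_file($cacheFile)) {
    $data = file_get_contents($img, false, stream_context_create([
        'http' => ['timeout' => 10],
    ]));
    if ($data === false || strlen($data) > $maxBytes) {
        http_response_code(502);
        exit;
    }
    file_put_contents($cacheFile, $data);
}

// Serve the cached copy with a best-guess content type.
$finfo = new finfo(FILEINFO_MIME_TYPE);
header('Content-Type: ' . $finfo->file($cacheFile));
readfile($cacheFile);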


 query : Re: Why is Coffeyville, Kansas sending large amounts of traffic in Google Analytics? I work for a UK-based digital-focused company. We have a few thousand US-based users but it is by no means

@Yeniel560

Take a look at your server logs and see if you can find an IP address (or small range) that is sending a lot of hits your way. Try looking up the IPs in an online tool like ip-lookup.net, which will give you a location and the domain or ISP the IP is associated with.

In your server logs, the lines with those IPs should have a user-agent string (browser name). That may give you a clue about what's happening, if for example it says "Googlebot". If not, then try searching for the IPs in Google itself to see if they belong to a well-known company.
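
If you have shell access, a quick way to spot the heaviest IPs in a standard combined-format access log is something like this (the log path is an assumption; adjust it for your server):

# Count requests per client IP and show the top 20 offenders
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20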


 query : Should I ignore W3 CSS Validation Service? I just checked my site with W3 CSS Validation Service. It had more than 200 errors. Then I checked many sites including Stack OverFlow. Here is

@Yeniel560

Posted in: #Css #Seo #Validation #W3cValidation

I just checked my site with the W3C CSS Validation Service. It had more than 200 errors.

Then I checked many sites, including Stack Overflow. Here are the results for Stack Overflow: it also has 108 errors.

It seems like SO just ignores them. What should I do? If we ignore them, will it affect SEO? I mean, could it cause a problem like Google not being able to properly read our website?


 query : Registering a transaction using only tag manager Is there a way to register a transaction in a website using nothing but the Tag Manager? So without any modification of the website itself? I

@Yeniel560

Posted in: #GoogleAnalytics #GoogleTagManager

Is there a way to register a transaction on a website using nothing but Tag Manager, i.e. without any modification of the website itself? I already have a trigger that fires at the right moment and registers an event.

I don't care about the value of the transaction ID or the total value of the transaction, except that they're required, so I need to find a way of generating a unique transaction ID.

I am aware of how pointless this makes the transaction, but I have a customer who insists because ... I just don't know.


 query : Re: Is shared hosting appropriate for a forums site, or do forums need something better? My website attracts fairly medium amount of traffic. Does a forum need something stronger? I think choosing

@Yeniel560

Separate the database server and the web server in your considerations, because it's about how many web pages you need to serve, and how many database hits you need to make to generate and serve those pages and manage forum content.

So it could be that you only need a small web server but a bigger database server, for example. That might be the case where you have a smaller community that is extremely active in posting.

It's hard to know the correct answer up front, but with the above in mind, you're better off choosing a hosting service that allows you to scale your server sizes up and down. Start small and see how the performance is. If it is fine, don't change anything. If you see your database showing load issues, upgrade your database machine: maybe it needs more memory, maybe more CPU, maybe both. If your web server is showing problems, upgrade it.

You won't know for sure until you try it.


 query : Nginx: set Content-Type by URI problem I'm setting up small personal file hosting, I made a backend to store uploaded files by their hash directly on the HDD like /f/ab/cd/ef012345678..., but

@Yeniel560

Posted in: #Cache #ContentType #Nginx #UrlRewriting

I'm setting up a small personal file host. I made a backend that stores uploaded files by their hash directly on the HDD, like /f/ab/cd/ef012345678..., but without file extensions, and generates URIs like example.com/f/ab/cd/ef012345678.../filename.jpg.

Nginx serves the files right from HDD using this config:

location /f {
rewrite "^/f/([a-z0-9]+)/([a-z0-9]+)/([a-z0-9]+)(.*)$" /$1/$2/$3 break;
root /srv/cdn/f;
}


I.e. it cuts the last part of the URI and reads the file using only its hash. It works well, but every file is served with the MIME type application/octet-stream, which forces the browser to download all files instead of displaying images inline.

I made a workaround by including in this location a piece of config like this:

if ($request_uri ~* \.png$) {
add_header Content-Type "image/png";
}


Which fixes the problem for Google Chrome and for the first load in Firefox.

But when I open the link a second time in Firefox, or just reload the page, it prompts me to download the file. Apparently Firefox handles a Content-Type header on a 304 response poorly (I checked the headers: with this configuration Nginx sends a 304 response together with the Content-Type header when the file is served from the cache).

I'm looking for a way to either determine in the Nginx config whether the file was served from the cache (and not add the header in that case), or a better Nginx-only solution.

I know the better idea is to change my backend and add file extensions, and that would solve some future problems too, but right now I can't sleep and need to know whether there is an Nginx-only answer.


 query : Re: URLs with 'NoIndex` in robots.txt are being indexed by Google In my robots.txt file (http://www.tutorvista.com/robots.txt), I'm using Noindex: /content/... to disallow indexing: This should mean

@Yeniel560

"noindex" directives should not be used in your robots.txt file, instead a noindex meta tag should be added to any pages that you don't want indexed in Google.

A NOINDEX tag looks like the below and it should be placed in the section of any page you do not want indexed:

<meta name="robots" content="noindex">


More information can be found here.

In the second example, while you do have "Disallow: /biology/" in your robots.txt file, a few lines above it you also have "Allow: /biology/animations/", which is why that page is indexed in your example.
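
For reference, that combination in a robots.txt looks like this (paths taken from your file):

User-agent: *
Allow: /biology/animations/
Disallow: /biology/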

Hope this helps!


 query : Why is Google taking time to index my website? I observed some of the web sites, their content indexing immediately whenever they created the data and results are showing in SERP, But why taking

@Yeniel560

Posted in: #Googlebot #GoogleSearch #GoogleSearchConsole #Seo

I have observed that some websites have their content indexed immediately, whenever they publish new content, and the results show up in the SERPs. Why does it take time for my website's content to be indexed? What factors are involved in getting results into the SERPs as early as possible?


 query : Re: Are there other options besides HTTPS for securing a website to avoid text input warnings in Chrome? One week ago Google sent me an email to go HTTPS. If I won't transfer HTTP to HTTPS then

@Yeniel560

Nobody else seems to have mentioned this: if you own every machine that connects to your site (for example in a corporate setting; this is probably not what you want otherwise), you can create a certificate authority of your own and install its public certificate on all the machines (or rather, into all the browsers and certificate stores) that connect to your site. This option is free of third-party monetization and uses the same encryption ciphers. You don't get signed into the public trust, though (your authority is not recognised by Google's Chrome, Mozilla's Firefox, etc., the way Let's Encrypt's is), but it will be recognised by the machines you have configured to trust it.

Admittedly, the details of setting up a certificate authority are tricky and somewhat obscure, and maintenance can be a lot of work, so I'll leave them out here; you're best off doing a deep dive on the topic if you're truly interested in this approach.

For a naive deployment, though, you can try XCA.

For Debian: packages.debian.org/stretch/xca (official site: xca.sourceforge.net/)

XCA is a decent way to issue certs for tiny deployments and includes help documentation that walks through the whole setup.
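
If you prefer the command line, a bare-bones sketch of the same idea with plain openssl looks roughly like this (every file name and subject below is a placeholder):

# 1. Create your own CA: a private key plus a self-signed root certificate.
openssl genrsa -out myca.key 4096
openssl req -x509 -new -key myca.key -sha256 -days 3650 -out myca.crt -subj "/CN=My Internal CA"

# 2. Create a key and CSR for the web server, then sign the CSR with your CA.
openssl genrsa -out intranet.key 2048
openssl req -new -key intranet.key -out intranet.csr -subj "/CN=intranet.example.com"
openssl x509 -req -in intranet.csr -CA myca.crt -CAkey myca.key -CAcreateserial -sha256 -days 825 -out intranet.crt

# 3. Install intranet.key and intranet.crt on the web server, and import myca.crt
#    into the certificate store of every machine that should trust it.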


 query : Re: Url redirect in dreamhost without editing htaccess We have a network of sites with an old link to a URL that doesn't exist anymore. We would like to redirect all the traffic to this URL to

@Yeniel560

Let's have a look at example 1 - redirecting example.com to www.example.com. The first line tells Apache to start the rewrite module. The next line:

RewriteCond %{HTTP_HOST} !www.example.com$

specifies that the following rule only fires when the HTTP host (that is, the domain of the requested URL) is not (specified with the "!") www.example.com. The $ means that the host must end there; combined with the inverting "!", the result is that every host that is not www.example.com will trigger the following rewrite rule and be redirected to that domain.

The final line describes the action that should be executed:

RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]

The target http://www.example.com/$1 is the "final" domain name, where $1 contains the content of the (.*).

The last part, [L,R=301], is also important, since it does the 301 redirect for us automatically. L means this is the last rule in this run; after this rewrite the webserver will return a result. R=301 means that the webserver returns a "301 Moved Permanently" to the requesting browser or search engine.

Redirect visitors to a new site

You have an old website that is accessible under livemnc.com and a new website that is accessible under livemnc.in. Copying the content of the old website to the new website is the first step - but what comes after that? You should do a 301 "moved permanently" redirect from the old domain to the new domain, which is easy and has some advantages:

Search engines will be redirected to the new domain, and all related information will be moved to the new domain (but this might take some time).
Google PageRank will be transferred to the new domain, as well as other internal information that is used to set the position of pages in the search engine result pages (SERPs), such as TrustRank.


Create a 301 redirect for all HTTP requests that are going to the old domain.

Example - Redirect from oldexample.com to newexample.com:
RewriteEngine On
RewriteCond %{HTTP_HOST} !newexample.com$ [NC]
RewriteRule ^(.*)$ http://newexample.com/$1 [L,R=301]

RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !example.php
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://example.com/$1/ [L,R=301]


Explanation of the add trailing slash .htaccess rewrite rule:

The first line tells Apache that this is code for the rewrite engine of the mod_rewrite module of Apache. The 2nd line sets the current directory as page root. But the interesting part is:

RewriteCond %{REQUEST_FILENAME} !-f


makes sure that existing files will not get a slash added. You shouldn't do the same with directories since this would exclude the rewrite behavior for existing directories. The line

RewriteCond %{REQUEST_URI} !example.php


excludes a sample url that should not be rewritten. This is just an example. If you do not have a file or url that should not be rewritten, remove this line. The condition:

RewriteCond %{REQUEST_URI} !(.*)/$


ensures that the rule only fires for URLs that do not already end with a trailing slash. Finally, the rule:

RewriteRule ^(.*)$ http://example.com/$1/ [L,R=301]


does the 301 redirect to the same URL with a trailing slash appended. You should replace example.com with your own domain.


 query : Why does Ahrefs still report low organic traffic for our site? I'm using Ahrefs.com for backlink analysis. The site has grown significantly in last months - this website. The number of indexed

@Yeniel560

Posted in: #Backlinks #GoogleAnalytics #Seo #Traffic

I'm using Ahrefs.com for backlink analysis.

The site has grown significantly in the last months. The number of indexed pages is around 1,040 at the moment. Also, our blog has produced some decent, shareable content recently (around 1-2K shares). The number of organic keywords in the top 100 is higher than our competitors'.

In all these categories the site outperforms some competitors. Despite this, Ahrefs is reporting organic traffic 2-3 times lower than theirs. Why?


 query : Huge tables won't fit on mobile, can I tell Google the page is desktop only? I'm working on a product comparison page that is composed of a giant table listing the attributes of a great many

@Yeniel560

Posted in: #MetaTags #Mobile #RobotsTxt

I'm working on a product comparison page that is composed of a giant table listing the attributes of a great many products. It's useful on a desktop for those who want to dig into the details of product specifications, but it's completely useless on a mobile device. I can't think of any way of showing a comprehensive table of all this info on a mobile and having it work at all.

It's easy to use media queries to hide the links to that page for users on mobile devices, but I'm concerned about search traffic.

I could exclude it in robots.txt but it might be nice to have that page indexed to pull in some desktop and tablet traffic.

Is there any kind of meta tag that can be used to tell Google to not display this page in the SERPs when the user is searching on a mobile?


 query : Blog and BlogPosting for Google Schema.org has Article, Blog and BlogPosting. Structured Data Markup Helper only uses Article. If I understand Blog and BlogPosting are types of Article. So I suppose

@Yeniel560

Posted in: #Blog #SchemaOrg

Schema.org has Article, Blog and BlogPosting.

Structured Data Markup Helper only uses Article.

If I understand correctly, Blog and BlogPosting are types of Article, so I suppose they are just a way to be more precise.

I have a blog with a home page showing all the titles and a brief summary, and then a page for each post. Should I use Blog for the blog home page and BlogPosting for each post, or should I use only Article?


 query : Is redirecting old url to new url a bad idea with respect to SEO? My web application used to generate not so SEO friendly urls. They used to be like this: https://example.com/search?category_id=208

@Yeniel560

Posted in: #Redirects #Seo

My web application used to generate not-so-SEO-friendly URLs. They used to be like this:
example.com/search?category_id=208
(And no, this is not a search results page. This URL is generated only when a user clicks a particular category link.)

Now, after doing some enhancements, I am able to make it generate a somewhat SEO-friendly URL as follows:
example.com/search?category_id=metal-processing-and-machine-tool
And now I am being asked to redirect the old URL to this new URL instead of sending the old one to a 404. I asked an SEO specialist if this is right. He said, "This is totally wrong, your website might get penalised for doing this." Is he right?


 query : Google Analytics does not show any referrers from link to my site I created in a forum I added a link to my site on a forum. In source code I not see nofollow so this is follow link

@Yeniel560

Posted in: #Analytics #Backlinks #Google #Seo

I added a link to my site on a forum. In the source code I do not see nofollow, so this is a follow link, which should show up in Google Analytics under Acquisition > Referrals.

I clicked the link I added on the forum, but I am not seeing it appear in Google Analytics. Why? Is this a good or a bad backlink?

