
@Voss4911412

Voss4911412

Last seen: Tue 11 May, 2021

Signature:

Recent posts

 query : Homepage not at the top of the results for a "site:" search on Google When I do site: on Google for my website, the homepage is nowhere in search results. In fact, I can't find it anywhere

@Voss4911412

Posted in: #Google #GoogleIndex #GoogleSiteSearch

When I do site: on Google for my website, the homepage is nowhere in search results. In fact, I can't find it anywhere at all.

If I search for my brand name, it comes up and the pages are being indexed. My website is indexed overall, and I can't see anything on my site that would be blocking Google from crawling the homepage.

Will this have an impact at all? Is it something I should be worried about?


 query : Can I permit access to sub-domain from the main domain using the domain name rather than IP address? Stepping into React I want to utilize WordPress's REST API to feed a React site and plan

@Voss4911412

Posted in: #Api #Htaccess #Wordpress

I'm stepping into React and want to use WordPress's REST API to feed a React site. I plan to host the WordPress site on a subdomain like foo.bar.com, while bar.com would host the React portion.

I'd like to prevent anyone from hitting the subdomain foo.bar.com except requests coming from the domain bar.com. I'm curious whether this can be done with the domain name instead of the IP address, to avoid issues with services such as Cloudflare. I'm familiar with using .htaccess to allow only a specific IP into certain folders of a domain, but not into a subdomain. When I researched whether this has been asked before, I found:


Only allow access to /directory from a specific domain?
Only allow access to /admin from a specific domain?
Access Website by domain name only (not IP address)
configure only allow specific domains to access certain folders using .htaccess


I'm not able to find a definitive solution or implementation. Is there a way in .htaccess to deny all requests except those from the domain, and to redirect everyone else to foo.bar.com/404? I've redirected from a subdomain to the domain before with:

RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule . "http://www.example.com/" [R=301,L]


but on the subdomain, how can I allow access only for requests coming from the main domain?



Edit:

Apologies for the confusion. My goal is to prevent everyone except the main domain from accessing the API hosted on the subdomain, since everything in a React app is exposed to the client. I've done redirects before, but I've never allowed only one domain and redirected everyone else. I don't think an IP-based solution would be ideal, because if the main domain is also behind Cloudflare, that IP changes routinely as I recall.
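
For reference, here is a minimal .htaccess sketch of the kind of referrer check I have in mind for the subdomain. The domains are the placeholders from above, and since the Referer header can be spoofed or stripped by clients, it would only be a weak gate compared with an API key or token check.

RewriteEngine On
# Let through requests that claim to come from the main site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?bar\.com/ [NC]
# Avoid a redirect loop on the 404 page itself
RewriteCond %{REQUEST_URI} !^/404 [NC]
# Everyone else gets sent to the subdomain's 404 page
RewriteRule ^ https://foo.bar.com/404 [R=302,L]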


 query : Re: Is the time PHP takes to render a page and serve it to the client ever a problem? PHP is a ubiquitous language that among other things allows a server to render a webpage on request and

@Voss4911412

This was an issue...20 years ago.

Sure, if you have other major issues, such as poorly written PHP code or a server with serious load and resource management problems, it will eventually become problematic and manifest itself as "slow-loading PHP pages". But the same would be true if you were using any server-side scripting or database calls, or really serving ANYTHING at all under those conditions.

I code exclusively data-driven PHP web pages on everything right now, and PHP compilation and execution is NEVER an issue or slowdown for me.

Loading large resources such as jQuery, CSS libraries, Bootstrap, etc. can possibly be an issue. But if you load them asynchronously (whenever possible), your page will continue to load and not wait on them either. And they will most likely be cached after the first time they're loaded, so they cost essentially zero time after that as well.
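
As a small illustration, this is what asynchronous script loading looks like in markup; the file paths are placeholders, and this applies to scripts rather than plain stylesheets:

<!-- async: fetch in parallel, run as soon as it arrives (order not guaranteed) -->
<script async src="/js/analytics.js"></script>
<!-- defer: fetch in parallel, run after the document has been parsed, in order -->
<script defer src="/js/jquery.min.js"></script>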

In short: While there are a lot of things to worry about when designing your web site and user interface, properly written PHP code is not one of them.


 query : Should I be returning a 301 or 403 to a bad referrer in my .htaccess? I'm getting a large number of bad requests from a particular referrer such as site.foo and I would like to redirect all these

@Voss4911412

Posted in: #301Redirect #403Forbidden #Htaccess #Spam #Wordpress

I'm getting a large number of bad requests from a particular referrer such as site.foo, and I would like to redirect all of these requests to a dedicated page. The .htaccess code I've researched and am implementing is:

RewriteCond %{HTTP_REFERER} site.foobar
RewriteRule ^ bar.com/notallowed.html? [R=301,L]


However, after researching HTTP status codes, I'm still unsure about the 301:

301 Moved Permanently
This and all future requests should be directed to the given URI.


or the 403:

403 Forbidden
The request was valid, but the server is refusing action. The user might not have the necessary permissions for a resource, or may need an account of some sort.


So which is the proper use case, and am I using the right rewrite rule? Is there a proper HTTP status code to use to stop bad referrers?

In my research I ran across:


Redirect based on referrer domain
How do you block a referer but for a specific URL using .htaccess?
Can I block a referring site from my web site
How to block a referrer's full URL and not only the domain
How to block referral traffic from multiple referrers and subdomains in .htaccess file?


I'm listing these in case the next person who runs across this is curious.
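
For reference, one pattern I've seen for refusing bad referrers outright instead of redirecting them is mod_rewrite's F flag, which returns a 403 directly; a minimal sketch using the placeholder referrer from above:

RewriteEngine On
# Match any request whose Referer contains the bad domain ([NC] = case-insensitive)
RewriteCond %{HTTP_REFERER} site\.foobar [NC]
# Respond 403 Forbidden ("-" means no substitution) and stop processing rules
RewriteRule ^ - [F,L]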


 query : Are automatically generated spam backlinks that I didn't create likely to be a negative SEO attack from a competitor? In the last 10 days, my website traffic suddenly increased. Yesterday I noticed

@Voss4911412

Posted in: #Backlinks #GoogleSearchConsole #NegativeSeo #Seo #SeoAudit

In the last 10 days, my website traffic suddenly increased. Yesterday I noticed that my domain authority and backlink count also increased. I reviewed my backlinks in ahrefs.com and saw that more than 2,000 backlinks were generated in the last 6 days, most of them in the comment sections of WordPress websites (i.e., 300+ new spam backlinks are generated automatically every day).

I wonder how my website links are appearing in these comment sections; I didn't put them there. Some of my links are listed as pingbacks on some websites, some are listed as trackbacks, and some appear as the commenter's name. Almost all of them use the same keyword (i.e., the same anchor text).

I didn't do this. Can anybody tell me why it is happening?

Most of the links are nofollow. I think a competitor is deliberately doing this to ruin my website's reputation; am I correct or not? All of the links are generated in the comment sections of WordPress websites.

I know these low-quality links are bad for my SEO. How can I overcome this?

With 300+ links generated daily, I can't realistically use the Google disavow tool: I would have to spend half of every day verifying 300+ links just to maintain the .txt file for the disavow.

These backlinks have not yet shown up in Moz Open Site Explorer, so I can't use its Spam Analysis to generate the text file for links with 8+ spam flags.
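
For what it's worth, Google's disavow file format accepts whole domains as well as individual URLs, which would avoid listing every new comment link by hand; a minimal sketch with placeholder domains:

# Google disavow file: plain UTF-8 .txt, one entry per line, # starts a comment
# Disavow every link from an entire spam domain:
domain:spammy-wordpress-blog.example
domain:another-comment-spam-site.example
# Or disavow one specific URL:
http://spam.example/2017/09/some-comment-page/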


 query : Correct type for a top 10 photographers list I recently did a lot of reading about JSON-LD structured data types for my website, which lists the top 10 photographers in each city in my

@Voss4911412

Posted in: #SchemaOrg

I recently did a lot of reading about JSON-LD structured data types for my website, which lists the top 10 photographers in each city in my country. I'm struggling to find the correct data type for this.

Currently, my choice is ListItem. I want to know whether that is the correct one, and since that type has a lot of variations, I want to know which one fits best.
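
For reference, the pattern I keep running into is an ItemList whose elements are ListItem entries; the names and URLs below are placeholders, and whether Google shows a rich result for it is a separate question:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "name": "Top 10 photographers in Example City",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Example Photographer One",
      "url": "https://example.com/photographers/one"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Example Photographer Two",
      "url": "https://example.com/photographers/two"
    }
  ]
}
</script>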


 query : Do social media widgets have any effect on search engine optimisation? Facebook and other social media platforms have developed social media widgets which are aesthetically pleasing but have quite

@Voss4911412

Posted in: #Seo #SocialMedia

Facebook and other social media platforms have developed social media widgets which are aesthetically pleasing but load quite a lot of dynamic content. With this in mind, would having one of these installed on a given page have a positive or negative effect on the website's ranking in search engine results?

Source: developers.facebook.com/docs/plugins/page-plugin/


 query : Keep order status private from search engines? I searched hard before posting this question. I apologize if it is a duplicate or if this is not the correct forum. We have a homegrown shopping

@Voss4911412

Posted in: #BestPractices #Ecommerce #Security #WebCrawlers

I searched hard before posting this question. I apologize if it is a duplicate or if this is not the correct forum.

We have a homegrown shopping cart website. We do not require shoppers to register to place an order, but we still want to provide them with a way to review and check the status of their order online.

Our system generates a unique key value for each order. This key is included in a link sent to customers by email. Anyone with the link can therefore view the order status and limited information about the order.

I've noticed that search engines have been hitting our website using this private key. For example: An order with a Yahoo email address results in the order status function (with the private key) being hit by the Yahoo Slurp bot. Obviously Yahoo scraped the link from their user's email.

Two questions:


Why are search engines doing this? We have since modified our software to block detectable user agents and give them a 404.
Is the approach of emailing a link/key to customers a correct way to let them check the status of their orders without requiring login? What might be an alternative solution?


Thanks!

Edit: Addendum. Looking into how other stores handle this, I see one solution is to ask for the order number (which can be part of the link) and require the shopper to input another field, like zip code, to confirm authorization.
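
For reference, one extra layer independent of user-agent blocking would be telling well-behaved crawlers not to index those URLs at all; a minimal Apache sketch, assuming a hypothetical /order-status/ URL prefix and mod_setenvif plus mod_headers being available:

# Mark requests for the order-status pages, then send a noindex header for them
SetEnvIf Request_URI "^/order-status/" ORDER_STATUS_PAGE
Header set X-Robots-Tag "noindex, nofollow" env=ORDER_STATUS_PAGE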


 query : Re: How to access addon domain on cPanel hosting via URL when the root hosting domain is hosted on another host I've GoDaddy cPanel hosting account. The main domain is registered with GoDaddy and

@Voss4911412

I've got a cPanel account that had an addon domain whose domain name had expired. The files were still on the site and I needed to access the site's homepage. The site was created with WordPress.

I was able to access it via ipaddress/~account_name/folder_name/index.php
In my case, the folder name was the domain name: domain_name.com

It didn't work until I added the index.php. Using ipaddress/~account_name/domain_name.com gave me a 404 error.


 query : Would there be an SEO problem with having a section of "recommended" or "popular" links? Is there any negative impact of having a "recommended" or "popular" links at the end of an article (in

@Voss4911412

Posted in: #ChangeFrequency #Links #Seo

Is there any negative impact of having a "recommended" or "popular" links section at the end of an article (at the bottom of the page or in the sidebar)?

I am asking because the popular "block" changes every day, so all the pages change every day. Wouldn't Google be against that?


 query : Re: Why are there 50% fewer pageviews and 130% bounce rate increase when upgrading to Google Analytics Universal? My old ga.js code is: var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-XXXXXX-1']);

@Voss4911412

Make sure, if you are upgrading from ga.js to analytics.js, that you add your domain to the property's Referral Exclusion List. This is done by default for new properties, but it needs to be done manually for GA properties that existed prior to Google's changeover to analytics.js.

With analytics.js, a change in traffic source during a session ends the current session and starts a new one, so self-referrals may be the reason for some of the issues you are seeing.

If there were any virtual pageviews tracked with ga.js then they will need to be ported over to the analytics.js syntax to continue tracking.
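
A minimal sketch of that syntax change, with a placeholder virtual path:

// ga.js (classic) virtual pageview
_gaq.push(['_trackPageview', '/virtual/signup-complete']);

// analytics.js (Universal) equivalent
ga('send', 'pageview', '/virtual/signup-complete');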

ETA: there is also an upgrade guide available in the dev docs, in case you also need to upgrade event tracking, custom variables, and so forth: developers.google.com/analytics/devguides/collection/upgrade/reference/gajs-analyticsjs


 query : Re: does google crawl/follow link on a page with meta robots noindex We have added following meta tag on some of our pages: <meta content="noindex" name="robots"/> Does google crawl the content

@Voss4911412

The tag tells search engines not to include the page in their listings. As far as I'm aware, it can't stop a search engine from crawling your page, because the crawler needs to fetch the page in order to see whether you have removed the tag (in case you change your mind and want the page listed).
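
To illustrate the difference: stopping crawling is what robots.txt is for, while the meta tag only controls indexing. A minimal sketch with a placeholder path (note that a URL blocked in robots.txt can still be indexed from links pointing to it, just without its content):

# robots.txt - stops compliant crawlers from fetching the page at all
User-agent: *
Disallow: /private-page/

<!-- meta robots - the page can still be crawled, but should not be listed -->
<meta name="robots" content="noindex">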


 query : Re: Shall we make AMP page URL SEO friendly or using Query is ok? We are planning to make AMP pages for our dynamic site. My concern, Is it ok to use QueryString to pass the values or shall

@Voss4911412

If you are trying to increase your search engine rankings, it is best practice to use an SEO-friendly linking structure.

If your objective doesn't include increasing your SEO rankings and you market to your audience in other ways (directly) then any structure is fine.

What is your reason for not using SEO friendly structures, time? It comes down to what you value more, development time or search engine rankings.


 query : Mails reported as spam by gmail I am using a mailgun api on my ubuntu server. All mails that I send is now falling under the spam label of my recipient. When I did a spamtest on my e-mail,

@Voss4911412

Posted in: #Gmail #Spam

I am using the Mailgun API on my Ubuntu server. All mail that I send now falls under my recipient's spam label. When I ran a spam test on my e-mail, the following report was generated, and it has the X-Spam-Flag status set to Yes. Can anyone tell me how I can make my mail go to the recipient's inbox?

==========================================================

Summary of Results

SPF Check : pass
Sender-ID Check : pass
DKIM Check : pass

SpamAssassin Check : ham (non-spam)

Details:

HELO hostname: rs234.mailgun.us
Source IP: 209.61.151.234
mail-from: donotreply@id-pal.com

Anonymous To: ins-50f0arqa@isnotspam.com

SPF check details:

Result: pass
ID(s) verified: smtp.mail=donotreply@id-pal.com
DNS record(s):



Sender-ID check details:

Result: pass

ID(s) verified: smtp.mail=donotreply@id-pal.com
DNS record(s):



DKIM check details:

Result: pass
ID(s) verified: header.From=donotreply@id-pal.com
Selector=pic
domain=mg.id-pal.com
DomainKeys DNS Record=pic._domainkey.mg.id-pal.com



SpamAssassin check details:

SpamAssassin 3.4.1 (2015-04-28)

Result: ham (non-spam) (6.1 points, 10.0 required)

pts rule name description

-0.0 RCVD_IN_DNSWL_NONE RBL: no trust
[209.61.151.234 listed in list.dnswl.org]
3.5 BAYES_99 BODY: Bayes spam probability is 99 to 100%
[score: 1.0000]
-0.0 RCVD_IN_MSPIKE_H4 RBL: Very Good reputation (+4)
[209.61.151.234 listed in wl.mailspike.net]
-0.0 SPF_PASS SPF: sender matches SPF record
1.0 DK_SIGNED No description available.
1.0 DK_VERIFIED No description available.
0.2 BAYES_999 BODY: Bayes spam probability is 99.9 to 100%
[score: 1.0000]
0.1 HTML_MESSAGE BODY: HTML included in message
0.3 MIME_HTML_ONLY BODY: Message only has text/html MIME parts
0.1 DKIM_SIGNED Message has a DKIM or DK signature, not necessarily
valid
-0.1 DKIM_VALID Message has at least one valid DKIM or DK signature
-0.0 RCVD_IN_MSPIKE_WL Mailspike good senders
X-Spam-Status: Yes, hits=6.1 required=-20.0 tests=BAYES_99,BAYES_999,
DKIM_SIGNED,DKIM_VALID,DK_SIGNED,DK_VERIFIED,HTML_MESSAGE,MIME_HTML_ONLY,
RCVD_IN_DNSWL_NONE,RCVD_IN_MSPIKE_H4,RCVD_IN_MSPIKE_WL,SPF_PASS autolearn=no
autolearn_force=no version=3.4.0
X-Spam-Score: 6.1


==========================================================
Explanation of the possible results (adapted from draft-kucherawy-sender-auth-header-04.txt):

"pass"
the message passed the authentication test.

"fail"
the message failed the authentication test.

"softfail"
the message failed the authentication test, and the authentication
method has either an explicit or implicit policy which doesn't require
successful authentication of all messages from that domain.

"neutral"
the authentication method completed without errors, but was unable
to reach either a positive or a negative result about the message.

"temperror"
a temporary (recoverable) error occurred attempting to authenticate
the sender; either the process couldn't be completed locally, or
there was a temporary failure retrieving data required for the
authentication. A later retry may produce a more final result.

"permerror"
a permanent (unrecoverable) error occurred attempting to
authenticate the sender; either the process couldn't be completed
locally, or there was a permanent failure retrieving data required
for the authentication.

==========================================================

Original Email

From bounce+c0c8a6.3acfb-ins-50f0arqa=isnotspam.com@mg.id-pal.com Sat Sep 16 05:49:14 2017
Return-path:
X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on isnotspam.com
X-Spam-Flag: YES
X-Spam-Level: ******
X-Spam-Report:
* -0.0 RCVD_IN_DNSWL_NONE RBL: no
* trust
* [209.61.151.234 listed in list.dnswl.org]
* 3.5 BAYES_99 BODY: Bayes spam probability is 99 to 100%
* [score: 1.0000]
* -0.0 RCVD_IN_MSPIKE_H4 RBL: Very Good reputation (+4)
* [209.61.151.234 listed in wl.mailspike.net]
* -0.0 SPF_PASS SPF: sender matches SPF record
* 1.0 DK_SIGNED No description available.
* 1.0 DK_VERIFIED No description available.
* 0.2 BAYES_999 BODY: Bayes spam probability is 99.9 to 100%
* [score: 1.0000]
* 0.1 HTML_MESSAGE BODY: HTML included in message
* 0.3 MIME_HTML_ONLY BODY: Message only has text/html MIME parts
* 0.1 DKIM_SIGNED Message has a DKIM or DK signature, not necessarily
* valid
* -0.1 DKIM_VALID Message has at least one valid DKIM or DK signature
* -0.0 RCVD_IN_MSPIKE_WL Mailspike good senders
X-Spam-Status: Yes, hits=6.1 required=-20.0 tests=BAYES_99,BAYES_999,
DKIM_SIGNED,DKIM_VALID,DK_SIGNED,DK_VERIFIED,HTML_MESSAGE,MIME_HTML_ONLY,
RCVD_IN_DNSWL_NONE,RCVD_IN_MSPIKE_H4,RCVD_IN_MSPIKE_WL,SPF_PASS autolearn=no
autolearn_force=no version=3.4.0
Envelope-to: ins-50f0arqa@isnotspam.com
Delivery-date: Sat, 16 Sep 2017 05:49:14 +0000
Received: from rs234.mailgun.us ([209.61.151.234])
by localhost.localdomain with esmtp (Exim 4.84_2)
(envelope-from )
id 1dt5yb-0000ss-Ll
for ins-50f0arqa@isnotspam.com; Sat, 16 Sep 2017 05:49:13 +0000
DKIM-Signature: a=rsa-sha256; v=1; c=relaxed/relaxed; d=mg.id-pal.com; q=dns/txt;
s=pic; t=1505540948; h=Content-Transfer-Encoding: Content-Type:
MIME-Version: To: From: Subject: Date: Message-ID: Sender;
bh=EzxR/OZhGy4blBn2yf7u2GfZYRITPz2eUwUOSlKVSIM=; b=Oazr5OKAjFD96m/i0dL8jgJR4c9XkGBtQEN+WaZDC2/PYfoLLpwrbNNneEhKaUUbmmOlIP2U
RcR8zA2fwlMlB7QuxESRjkEJdFkg0QMWIEe0C4qbC74PMpMZX721SqQi74599PXnwSP+0DVk
+q6M3xkZP9SCwtD3QWRmHOB4WE0=
DomainKey-Signature: a=rsa-sha1; c=nofws; d=mg.id-pal.com; s=pic;
q=dns; h=Sender: Message-ID: Date: Subject: From: To: MIME-Version:
Content-Type: Content-Transfer-Encoding;
b=U/JvTgPEay08uDDgWwrS+aZcw8o2See6cwIkt7vUkgF8Ek2BqbfUpyjJfPVgPXsOWoF1yI
EMIg1a3vnZam5weMg6l1oo+sXpcq/7lRHecRPAzX1idtp7p82WYx5ErjlgT2ds3KSMT6cCnh
JSZaIjEigAAC7RU6vyC3Qn/MJg8LI=
Sender: donotreply=id-pal.com@mg.id-pal.com
X-Mailgun-Sending-Ip: 209.61.151.234
X-Mailgun-Sid: WyJiMTI1MCIsICJpbnMtNTBmMGFycWFAaXNub3RzcGFtLmNvbSIsICIzYWNmYiJd
Received: from client.id-pal.com (ec2-52-210-201-175.eu-west-1.compute.amazonaws.com [52.210.201.175])
by mxa.mailgun.org with ESMTP id 59bcbb53.7fbbe811c340-smtp-out-n02;
Sat, 16 Sep 2017 05:49:07 -0000 (UTC)
Message-ID: <716baa8522e2fdd3becc4953ff3e747c@client.id-pal.com>
Date: Sat, 16 Sep 2017 06:49:06 +0100
Subject: business app
From: ID-Pal
To: ins-50f0arqa@isnotspam.com
MIME-Version: 1.0
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: quoted-printable
X-DKIM-Status: pass (mg.id-pal.com)


 query : Re: What is adding p, subid, and uid variables to the query string of a site url? One of our sites is seeing ~100 sessions a day landing on the home page with a query string like the following

@Voss4911412

I have been seeing a large number of these as well. I've recorded hundreds of hits without a single interaction on a site whose typical bounce rate is less than 10%. It's a robot.

There's no single IP, but they all appear to be US-based. A few weeks ago it was quite rare; now I can get a couple of hundred hits per hour. If it's a virus, it's gradually getting bigger.

I'd be keen to know if anyone has information on this.


 query : Why is the php.ini timezone setting not used to determine when a PHP cron job runs? I use shared hosting at Bluehost and changed the default date.timezone to UTC in my php.ini. Everything

@Voss4911412

Posted in: #Bluehost #Cron #Php #Timezone

I use shared hosting at Bluehost and changed the default date.timezone to UTC in my php.ini. Everything works fine with my PHP script, and date() returns the UTC time as expected.

The problem occurs with the crontab: when I enter a cron job to be executed every Monday at 00:15 UTC with the following line:

15 0 * * 1 php /home2/mywebsite/public_html/php/myscript.php


Then the script is not executed at 00:15 UTC but at 00:15 UTC-6, which is somewhere in the western US, I guess. For this reason I need to set it to 18:15 on the day before, Sunday, to have it executed on Monday at 00:15 UTC:

15 18 * * 0 php /home2/mywebsite/public_html/php/myscript.php


I asked their support, but they said they cannot help with the crontab. For them, it just works.

Is this related to the shared hosting, and is there a way to change it so that the crontab service uses my php.ini timezone setting?
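
For reference, since the crontab is read by the system cron daemon (which uses the server's timezone and never looks at php.ini), one workaround I'm considering is letting cron fire on the server's clock and having the script itself check UTC; a minimal sketch (the path is the one from above):

<?php
// Sketch: schedule the cron entry hourly around the target time, e.g.
//   15 * * * 0,1 php /home2/mywebsite/public_html/php/myscript.php
// and let the script decide whether it is really Monday 00:15 UTC.
$now = new DateTime('now', new DateTimeZone('UTC'));

// Exit quietly unless it is Monday (N = 1) during the 00:xx UTC hour.
if ($now->format('N') !== '1' || $now->format('H') !== '00') {
    exit;
}

// ... the real work of myscript.php goes here ...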


 query : Slow PHP redirects lead to loss of marketing clicks Ok so we are running liquid web server For some reason we are losing around 70% of all of our clicks that we are tracking on our end

@Voss4911412

Posted in: #ClickTracking #Marketing #Performance #Redirects

OK, so we are running a Liquid Web server.

For some reason, we are losing around 70% of the clicks that we track on our end compared to what we see on the HasOffers tracking platform.

These are mobile clicks, so we expect to lose a few, but not 70%.

They say a number of things could be causing this, but I want to break it down here and show what we have, so you can take a better look and try to help us out.

So we are running traffic to variations of this link here

alengetcheck.us

For each phone number that we have we create a different domain.

So for instance alengetcheck.us/pndq8q is associated with 12547098787 and alengetcheck.us/pn7894 may be associated with 789278299

When sending out, we do not use any HTTP or HTTPS prefix. We just send the link as-is: the domain and the extension.

So the problem we are having is that the redirect from alengetcheck.us/pndq8q to the other page takes about 1.7 seconds in a speed test, and our score is very poor due to the slow speed.

So the only thing our system does is take a link and redirect it. Very simple: we just have to redirect to the destination link associated with the code in the URL.

This is the script that we are currently using to redirect the link:

<?php
function get($url, $params=array()) {
$url = $url.'?'.http_build_query($params, '', '&');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
$response = curl_exec($ch);
curl_close($ch);
return json_decode($response);
}

$link = get('http://aptrack.us/get_link.php', array('code'=> $_GET['code']));
echo "<script>window.location='".$link->url."';</script>";
?>


As you can see, this is a very simple script, and it shouldn't take 1.7 seconds to redirect to the next page. It should take milliseconds.

So what should we do on our end now to make this go faster?

Is there anything wrong with our script or our calls that you can see that would cause such a lag?

How can we reduce server response time to 200 ms, which is what Google wants from us right now?

Any help or suggestions would be great!
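
For reference, here is a minimal sketch of the same lookup with hard cURL timeouts and a server-side 302 Location header instead of the echoed <script> redirect; the JavaScript redirect forces the browser to download and run the page first, and an unbounded call to aptrack.us can stall the whole request:

<?php
// Same lookup as above, but with timeouts and an HTTP-level redirect.
$ch = curl_init('http://aptrack.us/get_link.php?' . http_build_query(['code' => $_GET['code']]));
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 2, // fail fast if the tracker is slow to connect
    CURLOPT_TIMEOUT        => 3, // cap the total lookup time
]);
$response = curl_exec($ch);
curl_close($ch);

$link = json_decode($response);
if ($link !== null && !empty($link->url)) {
    header('Location: ' . $link->url, true, 302); // redirect without any HTML/JS
    exit;
}
http_response_code(404); // lookup failed: don't hang, return something definite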


 query : How to increase crawling and indexing of a website to get listed in Google What are ways to get Google to index our web pages rapidly? What activities should be performed from an SEO point of view for

@Voss4911412

Posted in: #CrawlRate #GoogleIndex #Indexing

What are ways to get Google to index our web pages rapidly?
What activities should be performed from an SEO point of view to increase indexing?
If we have a dynamic website, what should be done? What will help a dynamic website get its pages indexed in Google?
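
For reference, one concrete step that is usually recommended, especially for dynamic sites, is generating an XML sitemap and submitting it in Google Search Console; a minimal sketch with placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-09-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/some-dynamic-page</loc>
    <lastmod>2017-09-10</lastmod>
  </url>
</urlset>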


 query : Which is better: updating an old blog or starting a new blog? I had a technology blog like Labnol. It has around 50 posts. To be honest, those are not high-quality posts. They just have around

@Voss4911412

Posted in: #Google #Seo

I had a technology blog like Labnol. It has around 50 posts. To be honest, those are not high-quality posts. Each has only around 300-500 words, and they are very general posts like "Top 10 WordPress Plugins", "Best Android Apps", "How to Download a Whole Website", etc.

I stopped updating that blog about a year and a half ago. At that time it had around 2,000-4,000 monthly organic visits.

Now it has just around 300 monthly organic visits.

Now this is the question.

I plan to start a technology blog again with better quality.

For better SEO, should I start a new blog, or should I start adding new posts to the old blog?


 query : Connecting different goal types to see correlations in Google Analytics I am looking for a way to connect asset download goals early in the buying cycle with signups at the end of the cycle.

@Voss4911412

Posted in: #Analytics #Conversions #GoalTracking #GoogleAnalytics #UniversalAnalytics

I am looking for a way to connect asset download goals early in the buying cycle with signups at the end of the cycle.

My website has a complex buying cycle that happens over the course of months, for which we have top-, mid-, and bottom-of-funnel marketing goals. The top-of-funnel goals include downloads of white papers and ebooks, whereas the middle and bottom of the funnel are downloads of solution assets and sign-ups for free trials, demos, etc.

I am hoping to find a way to see in Google Analytics how many mid- and bottom-of-funnel goals were tied to users who had converted on top-of-funnel resources.

Is there a way in Google Analytics to connect multiple goals to a user?


 query : Should I use "-" in sub domain to separate two words? Should I use "-" in sub domain to separate two words? If I need to create sub domain called "blogging tips" should I use bloggingtips.example.com

@Voss4911412

Posted in: #Seo #Subdomain

Should I use "-" in a subdomain to separate two words?

If I need to create a subdomain called "blogging tips", should I use bloggingtips.example.com or blogging-tips.example.com?


 query : Re: How important are back links for SEO in 2017? In the past most SEO training promoted getting back links. There were plenty of article submission sites for that purpose. I have been away from

@Voss4911412

They are important; for example, Majestic tracks all backlinks to your site.

It is also important that the backlinks come from high-quality websites (e.g., .edu and .gov domains carry the most weight) and that your site follows all Google webmaster guidelines.

Because the quality of backlinks matters, some people have tried many techniques, for example creating a lot of low-quality backlinks to their competitors' URLs in order to lower their rankings.

Google developed software to detect this kind of spammy behavior, called Google Penguin.


 query : Backend of Prestashop Product Description Field hidden overflow I've got a Prestashop 1.7.1.1 wherein the backend when you edit the product description with images, it doesn't overflow properly.

@Voss4911412

Posted in: #Css #Prestashop

I've got a PrestaShop 1.7.1.1 installation where, in the back end, when you edit a product description that contains images, the field doesn't overflow properly.

I tried forcing the overflow in the CSS file with overflow: scroll !important, but that didn't do anything.

Has anyone encountered this bug and is aware of a fix/workaround?


 query : AdSense application rejected for "login only" content and difficult navigation, but content is public in a standard blog format I'm maintaining a blog that I hope to get accepted into the Adsense

@Voss4911412

Posted in: #GoogleAdsense #GoogleAdsensePolicies

I'm maintaining a blog that I hope to get accepted into the AdSense program. It was rejected for the following reasons:

Login Only: During our review of your website, we found that the majority of pages on your site are behind a login, or there is restricted access. Please note that we will not approve applications for login-protected pages, as we are not able to review their content for acceptance into the program.

Difficult site navigation: During our review of your website, we found your site difficult to navigate. Potential navigation issues include: redirects, pages behind a login or restricted access, broken links, excessive pop-ups, dialers, and pages under construction or not yet launched.

I don't think either of these makes sense. None of my content is behind a login, and I don't see that my site is particularly difficult to navigate. Am I missing anything obvious? My address is: koreanforinternauts.blogspot.ca/


 query : Can I do a 301 redirect for Googlebot only? Let's say I have a website called example.com, and I have a subdomain called sellingpage.example.com. I use sellingpage.example.com as a landing

@Voss4911412

Posted in: #Googlebot #Redirects #Seo

Let's say I have a website called example.com, and I have a subdomain called sellingpage.example.com.

I use sellingpage.example.com as the landing page for all my paid traffic; it is a landing page designed to explain our services to users and sell them our product.

example.com is a page where the user gets a pop-up with two options, "I am already a user" / "I am not a user". If he clicks the first option he is prompted to log in; if he clicks the second option he is redirected to sellingpage.example.com.

Will I get punished by Google if I create a 301 redirect only for Googlebot, so that when it accesses example.com it gets redirected to sellingpage.example.com?


 query : What can I do to improve my chances of having different sitelinks? I have a website that is listed in google. The content on the main page changes every week or so with different articles

@Voss4911412

Posted in: #Google #Sitelinks

I have a website that is listed in Google. The content on the main page changes every week or so, with different articles being featured. On one side I have links to things like login, timeline, about, etc. When I search for example.com on Google (not panpact, because my site doesn't rank first for that for some reason), the main page comes up with 10 sitelinks below it, but they are all just links to various articles. I want it to display login, timeline, about, etc. There used to be a demote feature, but that was removed, unfortunately. Is my site laid out in a bad way that inhibits Google from listing the correct sitelinks? What can I do to improve my chances of having the right sitelinks?


 query : SEO: h1 & h2 - what should go into them? I am still not quite sure today what should I put in H1 and H2 for SEO. For instance, should my site title be H1 and my article title be H2?

@Voss4911412

Posted in: #Html #Seo

I am still not quite sure what I should put in H1 and H2 for SEO.

For instance, should my site title be H1 and my article title be H2?

<head>
<title>My First Post - Another WordPress Website</title>
</head>

<h1>Another WordPress Website</h1>

<h2>My First Post</h2>
<p>brah brah brah</p>
<p>brah brah brah</p>


Or:

<head>
<title>My First Post - Another WordPress Website</title>
</head>

<p>Another WordPress Website</p>

<h1>My First Post</h1>
<p>brah brah brah</p>
<p>brah brah brah</p>


Which is better?


 query : Re: Trying to change web development companies - what assets do we need from the old company to change? My company is changing our web dev company, as the current one is being non-responsive. We

@Voss4911412

At minimum, you will need FTP access to download your website code/media/files/etc., and also a backup of the database in SQL format (.sql).

If you want to keep the domain name you will need to have the new web host transfer it.


 query : Can a Web store with .eu domain sell products intended only for UK market? This has nothing to do with Brexit and I am not even sure this is the right place to ask this question. I recently

@Voss4911412

Posted in: #Domains #Legal #Webshop

This has nothing to do with Brexit, and I am not even sure this is the right place to ask this question. I recently bought an electrical appliance from a Web store with the .eu domain extension, only to realize after it arrived that it has a UK power adapter. Nowhere on the product page was it mentioned that it comes with a UK power adapter. When I contacted their support about it, they replied that their Terms & Conditions page states that all products conform to the rules and regulations of the UK market. That page is the only place where they mention this. Their FAQ doesn't say anything about it.

And my question is: aren't there any legal implications requiring the products of a Web shop with a .eu domain to conform to EU standards?

