Why use pingler or pingomatic type of services when I could build it myself?

Not sure if I'm asking in the right section; I usually hang around Stack Overflow.
I have a website that gets between 10 and 30 new articles a day written by our team. We like to automate as much as possible and have the development capacity to do so.
One thing that stands out is people saying that pinging pages with Pingler or Pingomatic is great for ranking, but there are usage limits on those services, and it's manual.
How does one ping a site or search engine? The only ping I know of is the command-line ping. Is it the same thing?
You have this a bit backwards; I'll explain based on your questions:
The only ping I know of is the command-line ping. Is it the same thing?
The command-line ping just sends ICMP echo packets to check whether a host is reachable (resolving the domain name to an IP address along the way). That's not the same thing: in the SEO sense, "pinging" means sending a notification to a service or search engine telling it that a page has been published or updated.
One thing that stands out is people saying that pinging pages with Pingler or Pingomatic is great for ranking, but there are usage limits on those services, and it's manual. How does one ping a site or search engine?
With programming. But if you're trying to actually get your articles ranked high in search engines, pinging them is not really the right way to go.
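To make "with programming" concrete: services like Pingomatic are, under the hood, just relaying a weblogUpdates-style XML-RPC call, which you can send yourself in a few lines. Here's a minimal sketch in Python; the endpoint URL and the ping_article name are my assumptions for illustration, so check the docs of whichever ping service you target:

    # A minimal weblogUpdates-style XML-RPC ping, the same kind of call
    # services like Pingomatic relay on your behalf.
    import xmlrpc.client

    # Assumed endpoint for illustration; verify against the service's docs.
    PING_ENDPOINT = "http://rpc.pingomatic.com/"

    def ping_article(site_name, article_url):
        server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
        # weblogUpdates.ping(name, url) is the classic blog-ping method;
        # the response is a struct with 'flerror' and 'message' fields.
        response = server.weblogUpdates.ping(site_name, article_url)
        return response.get("message", "")

    # e.g. ping_article("My Site", "https://example.com/new-article")

That removes the manual step and the per-account limits, but as I said, pinging alone won't do much for your rankings.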
I have a website that gets between 10 and 30 new articles a day written by our team. We like to automate as much as possible and have the development capacity to do so.
In the old days, search engines had a submission page where you would submit a sitemap for your site (basically a set of XML files that list your pages and how they're structured).
What I recommend you do is sign up for the webmaster tools of the search engines you want to rank in. Google and Bing each have their own (Google Search Console and Bing Webmaster Tools). Each has settings that let you define the crawl rate (the speed at which the search engine automatically checks out pages on your site to rank them). To help the search engines out, you should submit a sitemap to each one so they can discover your links much faster than by crawling from the homepage to figure out where the links are.
Once you've done this, the rest is automatic, if you're okay with "semi-slow automatic": as I said, the search engines will crawl your site on a regular basis. In Google Search Console, I think the lowest crawl speed is one request every three to five seconds.
What I tend to do, and what you should do to get your site out there more (especially if you add tons of links at once), is update your sitemap every time your pages change, because search engines can crawl those files just like any other page on your site. If you want this process to be automatic, name your sitemap files in a programmatic way: keep the same filename for the sitemap index file, then name the individual sitemap files 1.xml, 2.xml, 3.xml, etc. instead of jack.xml, john.xml, 123.xml. There's a sketch of this scheme below.
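Here's a rough sketch of that numbered-file scheme in Python. The output file names, directory, and base URL are placeholders; the 50,000-URLs-per-file cap comes from the sitemaps.org protocol linked below:

    # Write URL batches to 1.xml, 2.xml, ... and regenerate a sitemap index
    # with a fixed name that always points at the current set of files.
    from datetime import date
    from xml.sax.saxutils import escape

    XMLNS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def write_sitemaps(urls, base="https://example.com/sitemaps/", per_file=50000):
        names = []
        for i in range(0, len(urls), per_file):
            name = f"{i // per_file + 1}.xml"
            with open(name, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write(f'<urlset xmlns="{XMLNS}">\n')
                for url in urls[i:i + per_file]:
                    f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
                f.write("</urlset>\n")
            names.append(name)
        # The index keeps the same filename forever, so the search engines
        # always know where to look; only the numbered files it lists change.
        with open("sitemap_index.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write(f'<sitemapindex xmlns="{XMLNS}">\n')
            today = date.today().isoformat()
            for name in names:
                f.write(f"  <sitemap><loc>{base}{name}</loc>"
                        f"<lastmod>{today}</lastmod></sitemap>\n")
            f.write("</sitemapindex>\n")

Run it from a cron job (or whatever fires when your team publishes) and the sitemap stays current without anyone touching it.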
This is where you go to learn about sitemaps: www.sitemaps.org/index.html
And this is where the protocol details live, so you can make your sitemap files valid: www.sitemaps.org/protocol.html
You'll be interested in the section "Using Sitemap index files (to group multiple sitemap files)", about a third of the way down the page.
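One last piece for full automation: after regenerating the index, you can notify the engines directly with a plain HTTP GET to their sitemap "ping" endpoints. A sketch, with the caveat that these endpoint URLs are the historical ones (Google has since retired its version), so verify them against current documentation before relying on this:

    # Notify search engines that the sitemap index changed via their
    # historical "ping" endpoints; check current docs before using these.
    from urllib.parse import quote
    from urllib.request import urlopen

    SITEMAP_URL = "https://example.com/sitemaps/sitemap_index.xml"  # placeholder

    PING_ENDPOINTS = [
        "https://www.google.com/ping?sitemap=",  # historical; retired by Google
        "https://www.bing.com/ping?sitemap=",
    ]

    for endpoint in PING_ENDPOINTS:
        with urlopen(endpoint + quote(SITEMAP_URL, safe="")) as resp:
            print(endpoint, resp.status)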