
Showing posts from 2007

What works for Google Adsense?

Here are three tips on how to make your cash register ring with Google Adsense: Maintain a Google Sitemap for your site and keep it up to date. Place bookmarks so people can come back, and use only the social bookmarks that can really drive traffic; I recommend Technorati, Digg, Del.icio.us and StumbleUpon. Make content that people really want to see. If you find a good article to put on your website, don't copy it; do an article rewrite instead. If you have ways of doing things better than usual, write an article about it. Keep doing these things at all times. Put in two hours in the morning and two hours in the evening, which makes four hours a day. After doing this consistently, you will really see an improvement in your Google Adsense account! You can also diversify: make many quality sites with useful information that serve a real purpose and fulfill a need on the web. Diversity is the key. Don't worry if your site would…

New Linking Strategy of 2007

Focus on direct traffic from referrals: if you get a link, want and expect traffic from it. Make sure the link is located above the fold and in a very good location on the page. Focus on indirect traffic from improved positions in the SERPs (because of high rankings in search engines): make sure you have already tested which keyword phrases convert best on your site. That will push you to use those phrases as anchor text in a great linking campaign, and it leads to great rankings, great traffic, and great results in terms of sales. Let go of old, outdated methods: the old reciprocal linking methodology will not work today, so make sure that you only exchange links with relevant sites. Why is linking important? To gain link popularity and link reputation! Which is better, text links or image links? In terms of link popularity and link reputation, image links do little for you. What boosts you is the text within a link, called anchor text; a quick illustration follows below. It's not about going after the biggest and…
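To make the anchor text point concrete, here is a minimal illustration (example.com and the keyword phrase are placeholders):

<a href="http://www.example.com/">blue widgets</a>
(a text link: "blue widgets" is the anchor text that search engines associate with the target page)

<a href="http://www.example.com/"><img src="logo.gif" alt="Example logo"></a>
(an image link: there is no anchor text, so at best the alt attribute carries a weak hint)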

Tips in Advertising on Craigslist

First, define Craigslist: it is a centralized network of online urban communities featuring free classified advertisements (with jobs, internships, housing, personals, for sale/barter/wanted, services, community, gigs and resumes categories) and forums sorted by various topics. With over 10 million new classified ads each month, Craigslist is the leading classifieds service in any medium, and it ranks among the top 50 sites in the Alexa Traffic Rankings. The classified advertisements on Craigslist range from traditional buy/sell ads and community announcements to personal ads and even "erotic services". Because of its popularity, Craigslist.org is one of the sites most targeted by spammers, and Craigslist has been upgrading its system continuously to fight them. When you post the same ad in different locations, or multiple ads in a particular location, your posts will be flagged and removed from the list. This makes ad posting on Craigslist difficult…

9 Step Process for Getting Authoritative Links

There are many types of authoritative links, and different sites require different approaches for getting the link, but the main objective is to build your site into a stable, authoritative domain. Here is a summary of the 9-step process for getting authoritative links through relationship building: Build a list of authoritative sites that you would really like to get a link from. Review the content and tools you currently have, or can easily develop. Take your highest-priority targets and do some research; really understand what each site is about. Make your first contact totally focused on helping them somehow, such as solving a stated need you uncovered in your research. Simply help them, and ask for nothing in return. Do it again, or perhaps even several more times. Along the way, make sure that they are aware that you have a related site (perhaps the site name is in your signature, for example). After you have established a dialog with them, and they are really responding to you…

Character Sets Issue

Character Sets: We know we need an encoding tag for compliance purposes, but which one? <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"> <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"> If all a site is doing is serving content, using UTF-8 is fine. (Note that ISO-8859-1 is only byte-compatible with UTF-8 in the ASCII range; characters above 127 are encoded differently in the two.) Where you can get into trouble is when you are *accepting* data from visitors from some foreign countries. The browser will re-encode any data the user enters into UTF-8 and send it back to the server. If your server can't handle UTF-8 encoded data, then you could run into some trouble. For example, if you are using a database backend and a user from China enters some Chinese characters into a form, then your database needs to know how to handle the UTF-8 encoded data it just received. Most databases will accept UTF-8 data because it is "byte oriented", but knowing what…
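A minimal sketch of declaring UTF-8 end to end (the form action and field name here are placeholder examples):

<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">

<form method="post" action="/guestbook" accept-charset="UTF-8">
   <input type="text" name="comment">
</form>

The server-side script and the database connection then need to be configured for UTF-8 as well, or the bytes the browser sends back will be stored or displayed incorrectly.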

List of Ping URLs

Here's a list of ping URLs to notify whenever you have new updates on your website:

http://api.feedster.com/ping
http://api.moreover.com/ping
http://api.moreover.com/RPC2
http://blog.goo.ne.jp/XMLRPC
http://blogdb.jp/xmlrpc/
http://coreblog.org/ping/
http://ping.blo.gs/
http://ping.bloggers.jp/rpc/
http://ping.cocolog-nifty.com/xmlrpc
http://ping.syndic8.com/xmlrpc.php
http://ping.weblogalot.com/rpc.php
http://pinger.blogflux.com/rpc
http://rpc.blogrolling.com/pinger/
http://rpc.icerocket.com:10080/
http://rpc.pingomatic.com/
http://rpc.technorati.com/rpc/ping
http://rpc.weblogs.com/RPC2
http://topicexchange.com/RPC2
http://www.blogdigger.com/RPC2
http://xping.pubsub.com/ping
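Most of these endpoints speak XML-RPC. A minimal sketch of the standard weblogUpdates.ping call, POSTed to one of the URLs above (the blog name and URL are placeholders):

<?xml version="1.0"?>
<methodCall>
   <methodName>weblogUpdates.ping</methodName>
   <params>
      <param><value>My Blog</value></param>
      <param><value>http://www.example.com/</value></param>
   </params>
</methodCall>

Services such as Ping-O-Matic (rpc.pingomatic.com) will relay one ping to several of the other services for you.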

Changes to Track in a Website

The topic of a website's changelog is one of the interesting threads on WebmasterWorld! The changelog has a vital role in SEO. Some may find it unimportant, but I believe this should be one of the things we check once in a while. Here are some ideas of what to track, taken from WebmasterWorld:

Every change to robots.txt
Every change to .htaccess (or Internet Services Manager in IIS)
Site-wide template changes (especially menu changes)
DNS and hosting changes
New outbound links
Ad purchases and run-times
Last time all outbound links were checked to see what was on the other end
Last time the site was checked for "poison" words
Changes in titles, especially on the home page
Server updates (especially reboots or outages)
New inbound links (one-way or not)
Backups of files you have changed
Log of expenses for each website
Config files (Apache, MySQL, PHP, etc.)
Firewall ruleset
All changes to all pages, logged one page per day…

DTD Statements

A DTD (Document Type Definition) statement goes before the <HEAD> section of your web pages. This isn't just for your home page, but every page of your site. A statement allows your site to get indexed faster and deeper in Google. Using it also shortens the time your site spends in the Sandbox by an average of 12 days. However, using an incorrect DTD statement can hurt your site. For example, if you use a 4.01 statement but your code was written in 3.2, the engine will make note of it, and there is a strong likelihood that the page (or site) will be dropped or penalized. Why does it work? When GoogleBot comes to your site and sees the DTD statement, it knows how the code was written. Because it knows how the code is written, it can spider the site much faster. If every web page were compliant, Google could perform its massive monthly update and recrawl of the web much faster than the 4-5 weeks it currently takes. The strong recommendation is the Transitional DTD. It allows…
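For reference, the HTML 4.01 Transitional declaration recommended above is the standard W3C doctype and looks like this:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
   "http://www.w3.org/TR/html4/loose.dtd">

It belongs on the very first line of the document, before the <HTML> and <HEAD> tags.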

Social Bookmarking

Social Bookmarking is a trend that has spawned countless extremely popular web services; among the more popular ones are Technorati, Del.icio.us and Flickr. Basically, Social Bookmarking is the collective act of bookmarking (a.k.a. "tagging") and sharing internet links and resources by the internet population... you! The different Social Bookmarking services on the internet today allow people to upload, store, bookmark ("tag") and share everything from photos to news stories to links of interest. Social bookmarking involves the use of Tags: subjects, categories, or words assigned to various Objects. An Object is often a link (URL) to a particular webpage relevant to the Tag it is assigned to, but an Object can also be any relevant piece of data or information (like MP3 files, for instance). I recommend the Tag Generator & Social Bookmark Link Creator! Just provide the following info: Keywords, separated by commas; Article URL, the complete URL of the article…

Google Sitemap Errors

Here is a list of Google Sitemap errors, straight from Google's Webmaster Tools:

Compression error
Empty Sitemap
Invalid attribute value
Invalid date
Invalid tag value
Invalid URL
Invalid URL: We've detected that a Sitemap you've listed doesn't include the full URL.
Invalid XML: too many tags
Missing XML attribute
Missing XML tag
Nested indexing
Parsing error
Temporary error
Too many Sitemaps
Too many URLs
Unsupported file format
URL not allowed
Paths don't match: We've detected that you submitted your Sitemap using a URL path that doesn't include the www prefix.
Paths don't match: We've detected that you submitted your Sitemap using a URL path that includes the www prefix.
Incorrect namespace
Leading whitespace
A specific HTTP error

Google Sitemap Sample

The following sample shows Google Sitemap code in XML format:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84
   http://www.google.com/schemas/sitemap/0.84/sitemap.xsd">
   <url>
      <loc>http://www.seocebu.com/</loc>
      <lastmod>2007-04-02T03:06:58+00:00</lastmod>
      <changefreq>daily</changefreq>
      <priority>1</priority>
   </url>
   <url>
      <loc>http://www.seocebu.com/category/announcements/</loc>
      <lastmod>2007-04-02T11:06:58+00:00</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.9</priority>
   </url>
   <url>
      <loc>http://www.seocebu.com/contact/</loc>
      ...
   </url>
</urlset>

Google Sitemap Format

The Google Sitemap Protocol format consists of XML tags. All data values in a Sitemap must be entity-escaped, and the file itself must be UTF-8 encoded. The Sitemap must: begin with an opening <urlset> tag and end with a closing </urlset> tag; include a <url> entry for each URL as a parent XML tag; and include a <loc> child entry for each <url> parent tag. The available XML tags are described below.

<urlset> - Required. Encapsulates the file and references the current protocol standard.
<url> - Required. Parent tag for each URL entry. The remaining tags are children of this tag.
<loc> - Required. URL of the page. This URL must begin with the protocol (such as http) and end with a trailing slash if your web server requires it. This value must be less than 2,048 characters.
<lastmod> - Optional. The date of last modification of the file. The accepted date formats are: Complete date: YYYY-MM-DD (e.g. 2007-03-16)…

Resubmit Google Sitemap

When the Sitemap file of your website changes, you can resubmit it to Google to let them know about the updates to your pages. Resubmit your Sitemap in one of two ways: Sign into Google Sitemaps with your Google Account, and from the "Sitemaps" tab, select the checkbox beside your Sitemap and click the "Resubmit Selected" button. The Submitted date will update to reflect this latest submission. Alternatively, send Google an HTTP request. If you do this, you don't need to use the "Resubmit" link in your Google Sitemaps account. The Submitted column will continue to show the last time you manually clicked the link, but the Last Downloaded column will be updated to show the last time Google fetched your Sitemap. To resubmit your Sitemap using an HTTP request, issue your request to the following URL: www.google.com/webmasters/tools/ping?sitemap=sitemap_url For example, if your Sitemap is located at http://www.example.com/sitemap.gz, your URL will become: www.google.com/webmasters/tools/ping?sitemap=http://www.example.com/sitemap.gz

Google Sitemap Limits

You can provide multiple Sitemap files, but each Sitemap file must contain no more than 50,000 URLs and must be no larger than 10MB when uncompressed. These limits help ensure that your web server does not get bogged down serving very large files. If you want to list more than 50,000 URLs, or you anticipate your Sitemap growing beyond 50,000 URLs or 10MB, create multiple Sitemap files. If you do provide multiple Sitemaps, you can list them in a Sitemap index file; a Sitemap index file can list up to 1,000 Sitemaps.
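A minimal sketch of a Sitemap index file, using the same 0.84 schema as the sample post above (the domain and file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">
   <sitemap>
      <loc>http://www.example.com/sitemap1.xml.gz</loc>
      <lastmod>2007-04-01</lastmod>
   </sitemap>
   <sitemap>
      <loc>http://www.example.com/sitemap2.xml.gz</loc>
   </sitemap>
</sitemapindex>

Each <sitemap> entry points to one of your Sitemap files; you then submit only the index file.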

Add Google Sitemap

The first time you submit a Sitemap to Google (whether you created it using the Sitemap Generator, a third-party tool, or by hand), you must submit it by adding it to your Google Webmaster Tools account. This enables Google to provide you with useful status and statistical information: from your account, you can see if there are problems with your Sitemap or with any of the URLs listed in it. If you are adding a Mobile Sitemap, you must specify the markup language for the URLs contained in the Sitemap. When you make changes to your Sitemap, you can resubmit it using your Google Webmaster Tools account, or you can resubmit it using an HTTP request. These steps describe how to add a Sitemap that contains URLs for non-mobile content (this includes most websites); there are additional steps for Mobile Sitemaps. Once you have a Sitemap in one of the supported formats: Upload your Sitemap to your site in the highest-level directory you want search engines to crawl (generally, the root directory)…

Fix Supplemental Index in Google

Need help fixing the Supplemental Index problem in Google? Here are questions for you to analyze about the content on all your web pages:

Do you refresh it regularly?
Are you careful not to over-seed it with keywords?
Does the content reflect and support the headlines, page title, and meta tags?
Do your links conform to the best-linking practices described in the other articles in the Linking School?
Are you sure that none of your links leads to questionable domains or bad neighborhoods?
Do you add new links on a steady, incremental basis?

If you are getting out-ranked by your own content pages, then it is very possible that you are in Google's Supplemental Index! You might be asking two simple questions: How did I get into the supplemental index, and how do I get out of it? Here are my speculations about your web pages: You have little unique text on your pages (maybe a lot of images and little text). Duplicate content: copied…

www and non-www URL Issues

To avoid canonical issues, you need to use a mod_rewrite rule; a simple 301 works on single URLs but does not handle site-wide canonical issues. You can go to your browser and type in "seocebu.com" and you will be redirected to "www.seocebu.com". Here is the code, direct from our .htaccess file:

<IfModule mod_rewrite.c>
   RewriteEngine On
   RewriteBase /
   RewriteCond %{HTTP_HOST} !^www\.seocebu\.com [NC]
   RewriteRule ^(.*)$ http://www.seocebu.com/$1 [L,R=301]
</IfModule>

Google penalizes duplicate content. If it has two versions of one of your pages indexed, the www and the non-www version, you are tagged with duplicate content even though it is just one page. This will also help consolidate your link popularity: if you let the canonical split occur on your site, you will have one level of link popularity for the www version and another level for the non-www version. Fixing it could even push your PageRank up…

Hide Javascript and CSS Codes

Make sure that you move JavaScript and CSS to external files. In other words, hide your JavaScript and CSS code! There is no reason to clutter up your code and increase your page sizes. If you repeat the same JavaScript for a menu and the same CSS on each page, not only is that taking up space on your server and pushing your transfer rate higher than it needs to be, you are also cluttering up Google's index. With over 110,000 servers running its index, Google estimates that duplicate content, scraper sites, and repeated JavaScript and CSS information make its index 28% larger than it has to be. That is about 30,000 servers.

Sample of external JavaScript code:
<script src="ExternalJS_1.js" language="javascript" type="text/javascript"></script>

Sample of external CSS code:
<link href="style.css" rel="stylesheet" type="text/css">

Avoid CSS tricks such as keyword stuffing, etc.

Power of Robots.txt File

The robots.txt file was designed to inform bots how to behave on your site: what information they can get at and what they can't. It is a simple text file that is very easy to create once you understand the proper format. This system is called the Robots Exclusion Standard. To create your robots.txt file, use Notepad or another plain-text editor. DO NOT create your robots.txt file in an HTML editor like DreamWeaver, GoLive or FrontPage. FTP clients usually convert the file into Unix mode, but there are occasions when this will fail. The two parts of a robots.txt file:

User-agent - This line specifies the robot. For example: User-agent: googlebot You may also use the wildcard character "*" to specify all robots. For example: User-agent: * You can find user-agent names in your own logs by checking for requests to robots.txt. Most major search engines have names for their spiders.

Disallow - This consists of Disallow: directive lines. Just because a Disallow statement…
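A minimal sketch putting the two parts together (the directory names are placeholders):

User-agent: googlebot
Disallow: /cgi-bin/

User-agent: *
Disallow: /private/
Disallow: /temp/

A blank line separates records, and each record applies to the robots named in its User-agent line; an empty Disallow value would allow everything.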

Wordtracker Keyword Tool

The leading keyword research tool is Wordtracker! With Wordtracker, you'll know the best keywords to drive more traffic to your sites. Here are some terms that you will come across with this tool:

KEYWORD EFFECTIVENESS INDEX (KEI) - This compares the Count result (the number of times a keyword has appeared in the Wordtracker data) with the number of competing web pages, to pinpoint exactly which keywords are most effective for your campaign. The higher the KEI, the more popular your keywords are and the less competition they have, which means you have a better chance of getting to the top.

Count - This shows the number of times a particular keyword has appeared in the Wordtracker database.

24hr - This is the predicted daily traffic for each keyword in this search engine only.

Competing - Each keyword has been submitted to the search engine and the number of competing web pages given in response. The lower the competition, the easier you will find it to reach the top…
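Wordtracker's exact weighting isn't spelled out above, but the commonly cited KEI formula squares the popularity and divides by the competition:

KEI = (Count x Count) / Competing

For example, a keyword with a Count of 400 against 20,000 competing pages scores (400 x 400) / 20,000 = 8, while the same Count against 2,000,000 competing pages scores only 0.08, so the first keyword is the far more effective target.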

Recommended Free Keyword Tools

Here are some recommended free keyword tools for you to use:

Google Keyword Tool - Generates potential keywords for your ad campaign and reports their Google statistics, including search performance and seasonal trends. Start your search by entering your own keyword phrases or a specific URL.

Overture Keyword Selector Tool - Try the Overture Keyword Tool now!

Free Wordtracker Keywords

The free Wordtracker keyword tool generates up to 100 related keywords and an estimate of their daily search volume, though adult terms have been removed from the results. Visit the Free Keyword Suggestion Tool from Wordtracker. Here is the formula Wordtracker uses to estimate the daily search volume for any given keyword:

Legend:
A = Number of times the keyword appears in the Wordtracker database
B = Total number of keywords in the Wordtracker database
C = Estimated total number of daily queries on all search engines
D = Estimated daily volume of searches for the keyword

(A / B) x C = D

So for a keyword that appears 9,000 times in the Wordtracker database, with B = 319.4 million and C = 563.4 million, the predicted volume is:

(9,000 / 319.4 million) x 563.4 million = 15,873 daily searches

Every day, on average, Wordtracker collects about 3.55 million search terms…

Fake PageRank Detection Tool

Domain names are big business these days, and PageRank fraud is becoming a serious problem. When someone is offering a domain name for sale, they can artificially inflate the domain's PR. This is achieved when offending domains use a 301 or 302 redirect to point their site at a site with a high PageRank. It is a well-known trick in the SEO world, but people who aren't familiar with SEO are likely to fall for it. Use the Fake PageRank Detection Tool to check and see the real PageRank of a site now!

Google PageRank Prediction

iWebTool's Google PageRank Prediction tool does what it says: it predicts your future Google PageRank. The tool provides an estimate of future PageRank and should not be considered precise. Go to iWebTool and try their Google PageRank Prediction Tool. You can also read about the prediction for the next Google PageRank update at SeoCompany.ca!

Inbound Link Quality Rating Tool

Information about the ILQ Rating Tool Firefox extension: it provides the ILQ (Inbound Link Quality) rating of the site currently showing in your browser. There is also a web-based ILQ Rating Tool, which shows the number of inbound links from Yahoo, DMOZ, .gov and .edu sources that a given site has. As of v1.1, the ILQ Firefox extension works with Firefox 2.0. Download the FireFox ILQ 1.1 Rating Extension from SEOcompany.ca!

SEO for Firefox

The SEO for Firefox tool was designed to add more data to Google and Yahoo! results to make it easier to evaluate the value and competitive nature of a market. SEO for Firefox pulls in many useful marketing data points to make it easy to get a more holistic view of the competitive landscape of a market right from the search results. In addition to pulling in useful marketing data, this tool also provides links to the data sources so you can dig deeper into the data. If you are casually surfing, please turn this extension off; only turn it on if you are actively researching a market. In the status bar at the bottom of Firefox you can click the SEO for Firefox logo to turn it on or off: if it is colorful, it is on; if it is gray, it is off. This extension also has courtesy settings which allow you to ping search engines for data at a slow rate. You probably want to set the delay to at least 1 second or more. If you set it at 0, do not be surprised if some of the engines at least temporarily…

Search Status Tool

SearchStatus is a toolbar extension for Firefox and Mozilla that allows you to see how any and every website in the world is performing. Designed for the needs of search engine marketing professionals, this toolbar provides extensive search-related information about a site, all conveniently displayed in one discreet and compact toolbar. SearchStatus lets you view a site's Google PageRank, Google Category, Alexa Traffic Ranking, Alexa Incoming Links, Alexa Related Links and backward links from Google, Yahoo! and MSN. This combined search-related information means you can view not only the link importance of a site (according to Google) but also its traffic importance (according to Alexa), providing a balanced view of a site's effectiveness. The SearchStatus extension appears discreetly at the bottom of the browser on the status bar. If you choose to view backward links for a particular page, they open in new tabs in the same browser window. Disable the auto…

Useful Meta Tags in SEO

Here are the useful meta tags in SEO:

<meta name="description" content="Description goes here.">
The meta description gives the site's description. Its value is often used as the snippet in the SERPs, though not always. Make your site description compelling! It is recommended to keep it to a maximum of 20 words.

<meta name="keywords" content="your 1st keyword, your 2nd keyword">
The meta keywords tag tells the bots what your targeted keywords are. Google uses this as a reference when checking for spamming within your page. It is likewise recommended to keep it to a maximum of 20 words.

<meta name="robots" content="noodp">
The "noodp" value tells engines not to replace your description with the Open Directory (DMOZ) one. If you use the meta robots tag, don't use the "index,follow" parameter; that is what a search bot does by default.

<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
Read more about this in the Character Sets Issue post…

Useless Meta Tags in SEO

Here are some useless meta tags in SEO:

<meta name="robots" content="index,follow">
This meta tag tells all the robots that visit the site to index the page and follow all the links on it, which is what they do by default anyway. The possible parameters are: index, follow - index the page, follow links; noindex, nofollow - don't include the page in the index, don't follow links; index, nofollow - index the page, don't follow links; noindex, follow - don't include the page in the index, but follow links.

<meta name="page-topic" content="">
<meta name="abstract" content="">
<meta name="author" content="">

<meta name="revisit-after" content="1">
This meta tag tells the bot to come back after the number of days mentioned. Google ignores it. When your website doesn't do tricks and has good content, Google will visit your website every day!

…

Harmful Meta Tags in SEO

Here is a harmful meta tag in SEO:

<meta http-equiv="refresh" content="0.5;URL=../index.htm">

This meta tag is used to redirect a page, with the delay before redirection given in its content value. If you want to redirect a page, read about the 301 Redirect instead (a sketch follows below). If you have something to add, please write it in the form of a comment.
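For comparison, a minimal sketch of a proper server-side 301 on Apache, placed in the .htaccess file (the paths and domain are placeholders):

Redirect 301 /old-page.htm http://www.example.com/new-page.htm

Unlike a meta refresh, this explicitly tells search engines the page has moved permanently, so the old URL's link popularity can be passed on to the new one.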

Specific Guidelines of Google on Quality

Here are the specific principles of Google when it comes to quality:

Avoid hidden text or hidden links.
Don't employ cloaking or sneaky redirects.
Don't send automated queries to Google.
Don't load pages with irrelevant words.
Don't create multiple pages, subdomains, or domains with substantially duplicate content.
Don't create pages that install viruses, trojans, or other badware.
Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

If a site doesn't meet Google's quality guidelines, it may be blocked from the index. If you determine that your site doesn't meet these guidelines, you can modify your site so that it does and request reinclusion.

Basic Principles of Google on Quality

Here are the basic principles of Google when it comes to quality:

Make pages for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."

Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate Google's Terms of Service…

Google Quality Guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience, and subsequently enjoy better rankings, than those who spend their time looking for loopholes to exploit. If you believe that another site is abusing Google's quality guidelines, please report that site at http://www.google.com/contact/spamreport.html. Google prefers developing scalable and automated solutions to problems, and so tries to minimize hand-to-hand spam fighting. The spam reports it receives are used to create scalable algorithms that recognize and block future spam attempts…

Define Supplemental Index

Google has two indexes: the Main Index and the Supplemental Index. The Supplemental Index is an area for pages that Google had difficulty with: it could not completely index them, the page is "stale", or a variety of other issues. You can also look at the Supplemental Index as the last "holding ground" for a page before it gets dumped from the index completely. How can you tell if your site has pages listed in the Supplemental Index? Here's a Google hack that works as of the moment:

site:www.domain.com *** -view

Make sure you have a space before and after the three asterisks. If the hack isn't working, you have to drill down to the end of the site listing (site:www.domain.com), and any pages in the Supplemental Index will have "Supplemental Result" listed in green. The Supplemental Index is NOT the Google Sandbox that new sites get thrown into. This is something completely different, and sites young and old get pages thrown into the Supplemental Index…

Define Supplemental Result

According to Google, a supplemental result is just like a regular web result, except that it's pulled from the supplemental index. Google is able to place fewer restraints on sites crawled for the supplemental index than on sites crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to the supplemental index. If you're a webmaster, note that the index in which a site is included is completely automated; there's no way to select or change the index in which a site appears. Be assured, too, that the index in which a site is included doesn't affect its PageRank. Read more about the Supplemental Index to understand this...

Define Google Sitemap

A Google Sitemap file lets a website tell Google about all its pages: information about those pages, which pages are most important, and how often they change. By submitting a Sitemap file to Google, you take control of the first part of the crawling/indexing process, which is letting Google discover your pages. This may be particularly helpful if your site has dynamic content, pages that aren't easily discovered by following links, or if your site is new and has few links to it. Sitemaps help speed up the discovery of your pages, which is an important first step in crawling and indexing them, but there are many other factors that influence the crawling/indexing process. Sitemaps let you tell Google information about your pages (which ones you think are most important, how often the pages change), so you can have a voice in these subsequent steps. Other factors include how many sites link to you, whether your content is unique and relevant, whether Google…

Google PageRank

PageRank™ is the heart of Google's software. It is a system for ranking web pages, and it continues to provide the basis for all of Google's search tools. PageRank is a link analysis algorithm which assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is also called the PageRank of E and denoted by PR(E). According to Google, PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But Google looks at more than the sheer volume of votes, or links, a page receives; it also analyzes the page that casts the vote…
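For reference, the formula from Brin and Page's original paper, where d is a damping factor usually set to 0.85, T1...Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

PR(A) = (1 - d) + d x ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

A page divides its own PageRank among the pages it links out to, which is why a vote from an important page with few outbound links counts for more than one from an unimportant or link-heavy page.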

Define Black Hat SEO

Black Hat search engine optimization is the set of practices used to get higher search engine rankings in an unethical manner. Black Hat SEO is characterized by: breaking search engine rules and regulations; creating a poor user experience as a direct result of the bad techniques used on the website; and unethically presenting content, in visual or non-visual ways, to search engine robots and search engine users. Black Hat SEO practices may provide short-term gains in rankings, but if you are discovered using these spam techniques on your website, you run the risk of being penalized by search engines. Black Hat SEO is a short-sighted solution to a long-term problem, which is creating a website that provides both a great user experience and all that goes with it. Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or by eliminating their listings from their databases altogether. Such penalties…

Define SEM

A brief definition of SEM:

It is the abbreviation of Search Engine Marketing.
It is a set of marketing methods to increase the visibility of a website in search engine results pages (SERPs).
It has three main methods: Search Engine Optimization, Search Engine Advertising (commonly known as Pay Per Click), and Paid Inclusion.

SEO Defined and Its Key Components

Search Engine Optimization (SEO) is the process of improving the visibility and traffic of a website in search engines through organic and/or unpaid results. When done correctly (on-page SEO) and with consistency (off-page SEO), this practice increases the quantity and quality of traffic going to a website while giving users a better experience overall. Organic search results are listings on SERPs (Search Engine Results Pages) that appear because of their relevance to the search terms and do not involve paid advertising.

What are the Key Components of SEO? Technically, SEO is categorized into two groups: on-page SEO and off-page SEO.

On-page SEO - The on-page search ranking factors are those that are almost entirely within the publisher's own control. This refers to every aspect of optimizing the actual web pages for search.

Off-page SEO - The off-page search ranking factors are those that publishers do not directly control. This refers…