Welcome To Fusion SEO Delhi

Fusion SEO Delhi provides the best SEO and web development services in India and worldwide.

What is SEO?

SEO is an acronym for “Search Engine Optimization”. It is the process of improving your website's visibility on popular search engines such as Google, Yahoo, and Bing.

Who We Are

We are dedicated to our work and very professional towards our clients. We keep providing services until our clients are 100% satisfied.

Our Team

We have a small, diverse team of professionals working on online development and marketing.

Our Customer Support

We provide dedicated customer support to our clients so that we can build strong, lasting professional relationships.

Welcome to Fusion SEO Delhi

Fusion SEO Services Provider is one of the best SEO service providers, based in Delhi. We provide our services not only in Delhi but across India and worldwide. We also provide new website creation and maintenance services to our clients. For new customers, we set up a basic campaign structure for SEO services and explain how these help expand their business.

We pride ourselves on being friendly and approachable, and on creating long-lasting relationships with our clients. Our high-quality, innovative approach, in addition to our affordable prices, separates us from typical web design and software companies. We believe in 100% customer satisfaction, so we provide quality services to our clients and have many satisfied customers all over India.

SEO Tips - How to Use Forums & QA Sites Effectively to Improve SERP Rankings


SEO strategy is crafted around many activities, including website optimization, on-page optimization, off-page optimization, and more. Off-page optimization is one of the most important SEO activities, and a professional SEO performs many activities as part of it. One of the most effective off-page SEO activities is participating in forums and QA (Question & Answer) websites such as Quora. However, many SEO people still use the old-school tactic of making three posts in a forum just to get the benefit of a forum-signature link. This technique is obsolete and no longer provides the intended benefits; in fact, it opens the door to a possible Google penalty that will harm the website and its rankings in Google. This article shares a few quick tips and best practices an SEO professional can follow to gain benefit from forum posting and QA activities, along with the benefits of these activities.
Best practices for using forums and QA sites to support your SEO activities:
Steps to follow for effective use of forums and QA sites:
  • Join relevant forums or QA sites with an active community. There is no point in joining a fashion forum when you are selling digital marketing services, or in joining a forum where there is no audience or where everyone is only trying to sell their own stuff.
  • Fill out your profile completely. Don't forget to add your company website, blog, and social profile links.
  • Find active or recent threads. There is no point in digging up dead threads and answering them.
  • Read the question and its answers carefully. See whether the question has already been answered or is still open to possible solutions.
  • If the question has not been answered yet, use your knowledge or research to answer it with all possible details and reference links, if any. Don't hesitate to link to other websites.
  • Check whether anyone has questioned your answer or asked for more details, and reply. Take part in the ongoing discussion.
  • Ask questions that make sense. Don't ask questions just to start a new thread; ask questions that arouse the interest of others.
  • Actively contribute
The takeaway: join a few relevant forums, QA sites, and communities; be an active contributor; and participate in discussions.

What are the benefits of the above practices?
You will establish yourself as an expert, and people will come to see you as a reliable source of solutions. This will generate leads when they are looking for the services you offer.
So you will get:
  • User engagement
  • Branding
  • Increased flow of relevant visitors to your website
  • Increased lead generation
  • Increased lead conversion

What must SEO professionals stop doing?
SEO (search engine optimization) professionals must stop following old-school forum practices for off-page optimization, as these will harm your website or your client's website. Below are a few commonly followed bad practices:
  • Joining hundreds of forums (relevant and irrelevant)
  • Giving one-liner answers
  • Giving random or copy-pasted answers
  • Digging up dead threads
  • Never looking back at answered threads
  • Mass forum link building
Please understand that these types of activities will get your website penalized under Google's Penguin algorithm.

Advanced Insights on Google Indexing, Crawling, and JavaScript Rendering


This blog post is a summary of the “Deliver search-friendly JavaScript-powered websites” session at Google I/O 2018, with an e-commerce lens applied and a few personal opinions thrown in. The talk is important enough that I thought it was worth its own blog post. The presentation describes how Google crawls and indexes sites, including how it deals with client-side rendered sites as made common by frameworks like React, Vue, and Angular. (Client-side rendering refers to when JavaScript in the browser (the client) forms the HTML to display on the page, as distinct from server-side rendering, when code on the web server forms the HTML to be returned to the browser for display.)
This discussion is particularly relevant to e-commerce websites that have a Progressive Web App (PWA) built with a technology such as React, Vue, or Angular and that want all their product and category pages indexed.
Crawling, Rendering, and Indexing
How does Google collect web content to index? This consists of three steps that work together: crawling, rendering, and indexing.
Crawling is the process of retrieving pages from the web. The Google crawler follows <a href="..."> links on pages to discover other pages on a site. Sites can have a robots.txt file to block particular pages from being indexed and a sitemaps.xml file to explicitly list URLs that a site would like to be indexed.
For example, an e-commerce site might put all product pages into the sitemaps.xml file in case some products are not reachable by crawling (for instance, if JavaScript is used for the category navigation UI, there may be no <a href="..."> links in the HTML for the crawler to discover the product pages). An e-commerce site may also block indexing of the checkout page using robots.txt, as that page does not contain valuable content to index.
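As a rough illustration (not from the original talk), here is a minimal TypeScript sketch of how a store could generate such a sitemaps.xml file from a list of product URLs; the URLs and the helper function are hypothetical.

```typescript
// Hypothetical sketch: build a sitemaps.xml listing product pages that may not
// be reachable through crawlable <a href="..."> links. URLs are placeholders.
const productUrls: string[] = [
  "https://www.example-store.com/products/red-shoes",
  "https://www.example-store.com/products/blue-jacket",
];

function buildSitemap(urls: string[]): string {
  const entries = urls.map((url) => `  <url><loc>${url}</loc></url>`).join("\n");
  return [
    `<?xml version="1.0" encoding="UTF-8"?>`,
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">`,
    entries,
    `</urlset>`,
  ].join("\n");
}

console.log(buildSitemap(productUrls));
```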
To play well with crawlers, sites should have a canonical URL for each page so that a crawler can determine if two URLs lead to the same content. (A site might have multiple URLs that return the same page. One of the URLs should be nominated as the “canonical” URL.)
There was a period when the fashion for client-side rendered pages was to use ‘#’ (and ‘#!’ for an even shorter period) as a way to distinguish multiple pages. This worked with the “back” button in the browser history; normally, following ‘#’ links causes the browser to stay on the current page, which is the desired behavior with client-side rendering. However, to get PWA pages (e.g. product pages) indexed, the modern norm is to use the JavaScript browser history API, which allows different URLs to be recorded in the browser history without reloading the current page. This is the best approach if you want your site to be indexed, because Googlebot (and most other crawlers) ignore what comes after the ‘#’ character, on the assumption that the ‘#’ identifies a different place to start the user on a page (the original purpose of ‘#’), not that the page will be different.
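To make the idea concrete, here is a minimal TypeScript sketch of the browser history API approach described above; the path and the render function are hypothetical placeholders, not code from the talk.

```typescript
// Sketch of client-side navigation that records a real URL (no '#') in the
// browser history. renderProductPage is a hypothetical stand-in for the PWA's
// own client-side rendering code.
function renderProductPage(path: string): void {
  console.log(`Client-side rendering ${path}`);
}

function navigateTo(path: string): void {
  history.pushState({ path }, "", path); // change the URL without reloading the page
  renderProductPage(path);
}

// Re-render the right page when the user presses the back/forward button.
window.addEventListener("popstate", (event) => {
  const state = event.state as { path?: string } | null;
  renderProductPage(state?.path ?? window.location.pathname);
});

navigateTo("/products/red-shoes"); // example usage with a placeholder path
```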
After a page is retrieved by a crawler, indexing extracts all the content to send off to the search indexes. This is also when <a href="..."> links to other pages are identified and sent back to the crawler to add to its queue of pages to retrieve.
One useful tip: if your page uses JavaScript to capture button clicks (without <a href="..."> markup), use <a href="..." onclick="..."> so the indexer will still see the URL, even though the user's click will be intercepted by the onclick JavaScript handler.
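A small sketch of that tip follows, with an illustrative selector and a hypothetical render function: the href stays in the markup for the indexer, while JavaScript intercepts the click for client-side navigation.

```typescript
// The markup keeps a real URL, e.g. <a class="product-link" href="/products/red-shoes">,
// so the indexer still sees the link; the click is handled client-side.
// ".product-link" and renderProductPage are illustrative, not from the talk.
function renderProductPage(path: string): void {
  console.log(`Client-side rendering ${path}`);
}

document.querySelectorAll<HTMLAnchorElement>("a.product-link").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();                // stop the full page reload
    history.pushState({}, "", link.href);  // keep the URL in the browser history
    renderProductPage(link.pathname);
  });
});
```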
Another tip: you can also use <noscript><a href="...">...</a></noscript> to embed other links you want crawled but don't want displayed.
Rendering is a step between crawling and indexing, created by the challenge of client side rendered pages. If a page is server side rendered, the crawler will have all of the content to be indexed already – no further rendering is required. If the page is client side rendered, JavaScript must be run to form the DOM (the HTML for the page) before the indexer can do its job.
At Google, that rendering is currently done using a farm of machines running Chrome 41, a somewhat old version of Chrome. This will be updated at some stage (maybe late 2018). That means if the JavaScript on a site uses newer JavaScript features today, it will fail to render.
A second problem is client side rendering takes up more CPU. Rather than doing such rendering in real time, Google currently sends available markup immediately for indexing, and then also sends the page to a secondary queue for additional processing by running the JavaScript on the page. Spare CPU capacity is used to perform such rendering, which could result in a client side rendered page being delayed by multiple days before its content is available for the indexer. (No time guarantees are provided – you can imagine the queue getting longer if multiple major sites rolled out new PWA support at the same time.) The old version of the page is then replaced by the enriched version of the page when available. This makes client side rendering less desirable for sites with frequent updates – the index may continuously lag behind the current content. It also means crawled links to other pages on a site may take multiple crawl iterations, each one incurring a potentially multi-day delay (if the pages are not all listed in the sitemap.xml file).
Another issue with client side rendering is not all non-Google crawlers support running the JavaScript to do client side rendering. Thus some indexers may not pick up all the content on your site.
So how best to build a PWA that can also be indexed?
Server Side, Client Side, Dynamic, and Hybrid / Universal Rendering
Server side rendering, as mentioned before, is where the web server returns all the HTML ready for display. This provides a fast first page load experience for users, is very friendly to indexers, but by definition is not a PWA.
Client-side rendering of pages, in comparison, requires all the relevant JavaScript files to be downloaded, parsed, and executed before the HTML to display is available. There are lots of clever tools around that try to break up the JavaScript into smaller files so the code can be downloaded incrementally as the user moves from page to page on a site. Client-side rendering is often slower for the first page, but faster for subsequent pages once JavaScript and CSS files start to get cached in the browser.
Dynamic rendering, introduced in the presentation, is where a web server looks at the User-Agent header and returns a server-side rendered page when the Google crawler fetches a page, and a client-side rendered version for normal users. (The server-side rendered page can be relatively plain looking, but should contain the same content as the client-side rendered page.) You just look for "Googlebot" (or the equivalent for other crawlers) in the User-Agent header to work out whether the request is coming from a crawler. (For extra safety you can also perform a reverse DNS lookup on the inbound IP address to make sure it really is coming from the Googlebot crawler.)
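A framework-agnostic TypeScript sketch of that idea follows; the crawler patterns and render functions are illustrative placeholders, not an official or complete list.

```typescript
// Sketch of dynamic rendering: serve pre-rendered HTML to known crawlers and
// the normal client-side rendered shell to everyone else. Patterns and render
// functions are illustrative placeholders.
const CRAWLER_PATTERNS = [/Googlebot/i, /bingbot/i];

function isCrawler(userAgent: string): boolean {
  return CRAWLER_PATTERNS.some((pattern) => pattern.test(userAgent));
}

function handleRequest(userAgent: string, url: string): string {
  return isCrawler(userAgent)
    ? renderServerSide(url)   // plain but complete HTML for the crawler
    : clientSideShell(url);   // PWA shell that the browser renders with JavaScript
}

function renderServerSide(url: string): string {
  return `<html><body><h1>Full product content for ${url}</h1></body></html>`;
}

function clientSideShell(_url: string): string {
  return `<html><body><div id="app"></div><script src="/app.js"></script></body></html>`;
}

// Example call with a simplified crawler user-agent string and a placeholder URL.
console.log(handleRequest("Googlebot/2.1 (+http://www.google.com/bot.html)", "/products/red-shoes"));
```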
Hybrid / Universal rendering is also becoming more widely supported by frameworks such as React, Vue, and Angular. Hybrid rendering is where the web server performs server side rendering of the first page (resulting in faster page display in the browser, as well as simplifying the job for crawlers) then uses client side rendering for subsequent pages. Today, this is easiest to implement when the web server runs JavaScript. (Magento for example runs PHP on the server side, which makes it harder to server side render React components as planned in the upcoming PWA Studio.)
Projects like VueStorefront.io and FrontCommerce do this today, and it could be added to PWA Studio in the future or by a helpful community member.
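For a flavor of what hybrid/universal rendering looks like in code, here is a tiny TypeScript sketch using React's server renderer (assuming the react and react-dom packages are installed); the component and product data are made up.

```typescript
// Sketch of the server-side half of hybrid/universal rendering: the server
// returns fully formed HTML for the first page request. Assumes "react" and
// "react-dom" are installed; the component and data are made up.
import { createElement } from "react";
import { renderToString } from "react-dom/server";

function ProductPage(props: { name: string }) {
  return createElement("h1", null, `Product: ${props.name}`);
}

// On the server: render the first page to an HTML string for the browser (and crawlers).
const html = renderToString(createElement(ProductPage, { name: "Red Shoes" }));
console.log(html);

// In the browser, the same component would then be hydrated (e.g. with hydrateRoot
// from "react-dom/client"), and subsequent pages rendered client-side.
```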
Other Tools
There are other tools that can be worth checking out.
  • Puppeteer is a JavaScript library that can control a headless version of Chrome, allowing interesting automation projects (see the sketch after this list).
  • Rendertron is an open source middleware project which can act as a proxy in front of your web site, doing client side rendering and returning the resultant page.
  • The Google Search Console allows you to explore how Google indexes your site. It has a number of new tools, such as "show me my page as Googlebot sees it", which is useful for debugging. It also contains a tool to see how mobile-friendly a website is (you should test multiple pages on your website). This can also be useful to see whether robots.txt is blocking files you thought were unnecessary but that negatively affect how Googlebot renders a page. (There is a desktop tool as well.)
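Here is a small Puppeteer sketch in TypeScript of the kind of thing a rendering proxy like Rendertron does: load a page in headless Chrome, let the JavaScript run, and capture the resulting HTML. It assumes the puppeteer npm package is installed, and the URL is a placeholder.

```typescript
// Sketch: render a client-side page in headless Chrome and return the resulting
// HTML. Assumes the "puppeteer" package is installed; the URL is a placeholder.
import puppeteer from "puppeteer";

async function renderPage(url: string): Promise<string> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" }); // wait for JS-driven requests to settle
  const html = await page.content();                   // the fully rendered DOM as HTML
  await browser.close();
  return html;
}

renderPage("https://www.example.com/").then((html) => console.log(html.length, "characters rendered"));
```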
Other Gotchas
Some other common issues that arise when pages are crawled include:
  • If you lazy load images using JavaScript, the images may not be found and included in the image search indexes. Consider using <noscript><img src="..."></noscript> to include references to such images without displaying them, or embedding "Structured Data" markup on the page (see the lazy-loading sketch after this list).
  • Infinite-scroll style applications that load more content as you scroll down the page (using JavaScript) require thought as to how much of the page Googlebot should see for indexing purposes. One approach is to render the longer page but hide part of it using CSS; another is to create separate pages for Google to index.
  • Make sure your pages are performant. Google will timeout and skip pages that are too slow to return.
  • Make sure your pages don’t assume the user first visited the home page (to set up “browser data” or similar). Googlebot performs stateless requests – no state from previous requests is retained, to mimic what a user landing on the site will see.
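As an illustration of the lazy-loading point above, here is a TypeScript sketch that keeps real image URLs discoverable: each image stores its URL in a data-src attribute (with a <noscript> fallback in the markup) and the script swaps the real source in when the image scrolls into view. Class and attribute names are illustrative.

```typescript
// Sketch: lazy-load images whose markup looks like
//   <img class="lazy" data-src="/img/red-shoes.jpg" alt="Red shoes">
//   <noscript><img src="/img/red-shoes.jpg" alt="Red shoes"></noscript>
// so crawlers can still find the image URL. Names are illustrative.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    if (img.dataset.src) {
      img.src = img.dataset.src; // load the real image once it is in view
    }
    obs.unobserve(img);
  }
});

document.querySelectorAll<HTMLImageElement>("img.lazy").forEach((img) => observer.observe(img));
```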
Conclusions
If you care about your site being visible in search indexes such as Google, and you are going to build a PWA, you need to think about how it is going to be indexed. If you need indexes to be updated promptly, the current best practice is to have the first page of the PWA server side rendered (using Hybrid/Universal rendering). This will work across the widest range of crawlers, with an additional benefit of the first page (normally) being faster to display (a traditional weakness of pure client side rendered solutions). Luckily the major PWA frameworks have Universal rendering support to reduce the effort required to get this going, as long as you can run a web server with JavaScript support.

Length of Search Results Snippets Decreased - Google Confirms


Google has confirmed that, only about five months after increasing the length of search results snippets, it has now decreased it again. Danny Sullivan of Google wrote, “Our search snippets are now shorter on average than in recent weeks.” He added that they are now “… slightly longer than before a change we made last December.”
Google told Search Engine Land in December that writing meta descriptions doesn’t change with longer search snippets, telling webmasters back then that there is “no need for publishers to suddenly expand their meta description tags.”
Sullivan said, “There is no fixed length for snippets. Length varies based on what our systems deem to be most useful.” He added that Google will not state a new maximum length for the snippets because they are generated dynamically.
RankRanger's tracker tool puts the new average length of the description snippet field on desktop at around 160 characters, down from around 300+ characters, while mobile search results snippets are now down to an average of about 130 characters.

If you went ahead and already lengthened your meta descriptions, should you go back and shorten them now? Google’s advice is to not focus too much on these, as many of the snippets Google chooses are dynamic anyway and not pulled from your meta descriptions. In fact, a recent study conducted by Yoast showed most of the snippets Google shows are not from the meta description, but rather they are from the content on your web pages.
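If you want a quick way to spot descriptions that are likely to be cut off, here is a small TypeScript sketch using the rough averages quoted above; these are observed averages, not official limits, and Google may rewrite the snippet regardless.

```typescript
// Sketch: flag meta descriptions longer than the rough averages quoted above
// (~160 characters on desktop, ~130 on mobile). These are observed averages,
// not official limits, and Google often builds snippets from page content anyway.
const DESKTOP_AVG = 160;
const MOBILE_AVG = 130;

function checkMetaDescription(description: string): string[] {
  const warnings: string[] = [];
  if (description.length > MOBILE_AVG) {
    warnings.push(`May be truncated on mobile (${description.length} > ~${MOBILE_AVG} characters).`);
  }
  if (description.length > DESKTOP_AVG) {
    warnings.push(`May be truncated on desktop (${description.length} > ~${DESKTOP_AVG} characters).`);
  }
  return warnings;
}

console.log(checkMetaDescription("A short example meta description for a product page."));
```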

The War Is On: Amazon vs. Google

Brief notes:
- Amazon stopped selling Google's Chromecast.
- Google dropped YouTube ads.
- Now Amazon is dropping all Shopping listing ads.




It has been reported that Amazon stopped bidding in Google's Product Listing Ad auctions as of April 28th. Google's Product Listing Ads are ads with product images that appear at the top of certain search results. Amazon had been a top bidder in those auctions for a year and a half, and in the meantime had been cultivating a competing product. Here is what's next and how this affects your business.
What is Google’s Shopping Ads Program?

Google’s Shopping Ads program is an enhanced version of standard AdWords advertising. It allows a merchant to upload a feed of their products, including photos. It results in an attractive product listing at the top of Google’s organic search results.

The benefit of a Shopping Ad is a higher click-through rate (CTR), which Google's support page reports can be as much as 300% better than standard text ads. The leads are also better, because a user can view product information before clicking an ad, so they know what they are clicking through to. These ads tend to take up more space on a mobile device, crowding out the organic results.
Is the Competition Between Amazon and Google Intensifying?

I asked PPC expert Jim Stratton, PPC Director at Jason Hennessey Consulting, about Amazon's possible next move, and he said,


“This might be a battle between Amazon and Google where Amazon was trying to gain insights to further advance their own bidding systems and technology.”

Putting on my tin foil hat, I noted that Amazon's participation may have helped drive up the cost of Google's Shopping program, possibly discouraging some advertisers from competing on Google Shopping Ads against Amazon. Jim expressed interest in the idea:


“…strategic bidding to pump up the cost of Google’s PLA and make Amazon’s more attractive is an interesting concept.”
Amazon is Google’s Competitor

Amazon is a formidable competitor for shopping advertising. As YouTube is a search engine for videos, Amazon is where consumers go directly to search for products.


Amazon is not just an online marketplace; Amazon is a shopping search engine. Shopping search is the most profitable slice of Google's pie, and that, together with Amazon's own PPC program, makes Amazon Google's biggest threat.
Takeaway for PPC and Merchants

Jim Stratton believes this is a moment of opportunity for merchants.


“The opportunity for PPC bidders, with Amazon dropping out, this opens up the space for competitors. Those advertisers who have been historically competing with Amazon will see their ad positions increase, higher click through rate percentages, and increased click volume, which will lead to higher ad spends but proportionally higher sales and profits.

I believe that this will open up the space for new advertisers who have been sitting on the sidelines because they didn't want to battle Amazon's deep pockets for their respective products and categories.”

So it may be that the moment is now for merchants to give Google Shopping a try, because now is as good as it's going to get. Read the full story at Merkleinc.com: "Amazon Abruptly Shuts Down Google Shopping Program".

Update on Google Algo: Google Confirms Core Search Ranking Algorithm Update - March 2018



Google has confirmed that they ran a “broad core algorithm update” last week that has impacted the appearance and rankings of some websites in the search results.
Google posted on Twitter that it does these types of updates “several times per year” and that there is nothing a site can do specifically to “fix” its ranking after a core update runs. “Some sites may note drops or gains,” Google explained, adding that if a page drops, it doesn't necessarily mean there is something wrong with that page; it is simply that Google changed its ranking models in a way that now benefits “pages that were previously under-rewarded.”
Here is Google’s statement on Twitter:
Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year.
As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.
There’s no “fix” for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.
There was speculation over the weekend about a Google update; this is Google confirming that speculation.

Search Result Update on Google "People also search for"

I have just noticed that, for some queries, Google has started to display "People also search for" just below a result once you click on that link and come back to the Google results page.


Some thought leaders have also tweeted about this.