Analytics Support for a Web-based Marketing Campaign

* Management of the Google AdWords program

AirTight operates in a niche with very low or no direct search volume, although many related areas do attract searches. Earlier, AirTight was bidding on its own niche keywords but getting no traction: the keywords stayed inactive despite high bids, impressions and clicks were very low, and the bounce rate was high. I analyzed the situation with the help of the Google AdWords documentation and realized that we needed to place ads on related keywords that do have traffic, and then guide the visitor toward the solutions AirTight offers by supplying the missing piece of information. For this I suggested a three-page model: the first page addresses exactly what the user was searching for and delivers what the ad promised; the second page explains how AirTight's area of work relates to the term the user searched for; the third page is AirTight's solution page, which makes the marketing pitch.

* Design of tools for analysis of Google Search results and placement of Google Ads

The idea was to understand where AirTight and its competitors stand in Google and Yahoo search results when a user searches for a related keyword. This information helps in two ways: (1) AirTight can stop bidding on keywords for which it already tops the organic results, and (2) it can apply focused energy where AirTight is lacking. I designed a combination of a bash script and a PHP web application to record the position information from the search results at a given time. After collecting data over a period of time, it can be plotted as a graph to analyze the trend.
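A minimal shell sketch of the recording step (the actual tool was a bash script plus a PHP application; the function name, file layout and log format here are illustrative, and the script works on an already-captured list of result URLs rather than scraping live search pages):

```shell
#!/bin/sh
# Hypothetical sketch: given a file with one search-result URL per line
# (as captured for a keyword), find the position of a domain in that list
# and append a timestamped record to a CSV log for later trend plotting.
# Usage: record_rank <results-file> <domain> <keyword>
record_rank() {
    results_file=$1; domain=$2; keyword=$3
    # grep -n prints "line:url"; the line number is the rank.
    rank=$(grep -n "$domain" "$results_file" | head -n 1 | cut -d: -f1)
    [ -z "$rank" ] && rank=0   # 0 = not found in the captured results
    echo "$(date +%Y-%m-%d),$keyword,$domain,$rank" >> rank_log.csv
}

# Example: a captured results list for one keyword
cat > results.txt <<'EOF'
http://www.example-competitor.com/wips
http://www.airtightnetworks.com/solutions
http://www.another-site.org/wlan
EOF

record_rank results.txt airtightnetworks.com "wireless security"
cat rank_log.csv   # the rank column records position 2
```

Run daily from cron, a log like this accumulates one row per keyword per day, which is exactly the series one would plot to watch the trend.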

Another idea was to find the places where AirTight should put ads on the content network. The objective of this tool was to find out which websites that rank high in Google and Yahoo search results carry Ads by Google (AdSense) on their landing pages.
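The detection step could be sketched like this (a simplification, not the actual tool: AdSense pages do load their script from googlesyndication.com, but treating that string's presence in the HTML as proof of "Ads by Google" is a heuristic; in production the HTML would come from curl or wget rather than a saved file):

```shell
#!/bin/sh
# Hypothetical check: does a fetched landing page carry Ads by Google?
# AdSense units load a script from pagead2.googlesyndication.com, so
# the presence of that domain in the page source is used as the signal.
has_adsense() {
    grep -q "googlesyndication.com" "$1" && echo "yes" || echo "no"
}

# A saved landing page that does carry an AdSense unit
cat > page.html <<'EOF'
<html><body>
<script src="http://pagead2.googlesyndication.com/pagead/show_ads.js"></script>
</body></html>
EOF

has_adsense page.html   # prints "yes"
```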

* Design of tools for analysis of click-stream data coming from website

AirTight was using Statcounter to collect the activities of website visitors. Earlier, the marketing team was using Statcounter's web interface to analyze the data and identify potential leads. This was a time- and effort-consuming task, and the analyst could see only a limited time window of visitor information. I designed a tool to load the raw data into an in-house database so that every visit stays recorded and better analysis becomes possible. The first phase of the analysis automation provides information about the organization a visitor came from, along with the visitor's complete history of activity. The second phase profiles visits and qualifies a visitor as a potential lead based on the pages he visited. Other extensions of this tool include understanding the intention and interest of an organization even when its visits come from different IP addresses, and identifying pages on the website that should be optimized or rewritten.
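The second-phase lead qualification could be sketched with awk (the real tool fed an in-house database; the CSV layout, the page names, the weights and the threshold below are assumptions for illustration, not Statcounter's actual export format):

```shell
#!/bin/sh
# Hypothetical lead scoring: weight each visitor IP by the pages visited
# and flag visitors whose score crosses a threshold as potential leads.
cat > visits.csv <<'EOF'
10.0.0.1,/home
10.0.0.1,/products/wips
10.0.0.1,/contact
10.0.0.2,/home
EOF

awk -F, '
  # Pages that signal buying intent score more heavily.
  /\/products\// { score[$1] += 5 }
  /\/contact/    { score[$1] += 10 }
  /\/home/       { score[$1] += 1 }
  END {
    for (ip in score)
      if (score[ip] >= 10) print ip, "potential-lead", score[ip]
  }
' visits.csv
```

Here only 10.0.0.1 crosses the threshold (1 + 5 + 10 = 16); a visitor who only hit the home page does not qualify.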

* Search engine optimization of the corporate website

Suggested changes in web pages to make them search-engine friendly; most of the knowledge came from Google's help documentation. This activity included changing the title, meta keywords, meta description, headings, image alt text and content so that each page stays focused on a single topic and ranks near the top of search results.
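A rough shell sketch of auditing a page for the on-page elements listed above (hypothetical checks; a real audit needs an HTML parser, since these greps only see one line at a time):

```shell
#!/bin/sh
# Hypothetical audit: flag a saved page that is missing a <title>,
# a meta description, or alt text on an image.
audit_page() {
    f=$1
    grep -qi "<title>"            "$f" || echo "missing: title"
    grep -qi 'name="description"' "$f" || echo "missing: meta description"
    # Any <img> tag without an alt attribute on the same line
    grep -i "<img" "$f" | grep -vqi "alt=" && echo "missing: image alt text"
}

cat > sample.html <<'EOF'
<html><head><title>Wireless Intrusion Prevention</title></head>
<body><img src="logo.png"></body></html>
EOF

audit_page sample.html   # flags the meta description and the alt text
```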

* Creating strategies for online marketing (awareness generation)

Suggested different methods of popularizing the technology and the product in the global marketplace using Web 2.0 tools such as blogs, forums, social bookmarking, feeds, etc.



This area of work involved:

  1. Gathering relevant data about

· Organic search results leading to relevant pages

· Google Ad placement and ROI

· Page visit analysis

  2. Analysis and recommendations for

· Content modifications for improved organic search outcome

· Google Ad placement decisions (considering location of Ad and ROI)

· Content reorganization to improve visit time and pages viewed

· Content modification to ensure clear relationship between Ad content and page content

  3. Development of analysis tools for

· Gathering click-stream data from multiple sources like Google Analytics, Statcounter etc.

· Filtering of data to gather relevant information

· Visualizing search results to help make right recommendations
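The filtering step listed above can be sketched in shell (the log layout, the internal IP range and the bot pattern are illustrative assumptions; real click-stream exports vary by source):

```shell
#!/bin/sh
# Hypothetical filtering: strip internal office IPs and known bot
# user-agents from a raw click-stream log before analysis.
cat > raw.csv <<'EOF'
192.168.1.10,Mozilla/5.0,/home
66.249.66.1,Googlebot/2.1,/home
203.0.113.7,Mozilla/5.0,/products
EOF

# Keep only external, human-looking visits.
grep -v "^192\.168\." raw.csv | grep -vi "bot" > filtered.csv
cat filtered.csv   # only the external human visit survives
```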

Challenges:

  1. Ad placement decision for ROI maximization (in presence of low search volume)
  2. Non-intuitive content reorganization based on search volume and page visit analysis

Technologies and Tools:

  • Google Analytics, Statcounter, Market2Lead, Google AdWords, Google Trends, Google Keyword Tool, Google Traffic Estimator, Google Advanced Search, Yahoo Site Explorer, Alexa, Technorati, Google Webmaster Tools, Keyword Density Tool, HTTP header viewer
  • HTML, PHP, ASP, JavaScript, CSS, Linux shell scripting, awk, sed
  • Content Management System (Typo3)

Role and Responsibilities:

  • Collect data about the website from different sources and send alerts to top management
  • Identify improvement areas and present them (with solutions) to the US marketing team
  • Coordinate with marketing, web development, technical writing and R&D teams to implement the changes recommended

