
Testing the Intent

'You are the 97 percent guys!!!' It sounds weird, but I was delighted to hear this when one of the candidates at a job fair came to our booth and told me that he had heard about Intent Media and wanted to know more about the company. Like many others, he was also astounded when I explained in detail what we do as a company. If you are interested in knowing more about Intent Media, visit www.intentmedia.com


I have been working at IM for the last 14 months as a quality analyst. Before IM I had varied experience as a tester, testing trading applications, web applications, and CRM applications with different integration points, but the testing at IM is challenging in its own way. I believe a yearning to do something different as a QA is what landed me here. IM, like many other NYC startups, has a great culture and an amazing set of people to work with, but what I like most is the problem we are trying to solve here: 'What to do with the other 97 percent?'

We are an e-commerce advertising company. We work with advertisers to show their ads on publisher websites such as Orbitz, Expedia, and Travelocity. I won't go further into the products or the advertising details; instead, I would prefer to talk about what I do at IM.
         
On a usual project, as QAs we would test the website or the software being built, but for us the software being built is the ads, not the website where the ads are shown. Quality assurance therefore involves ensuring that our ads appear correctly on publisher websites. That much would be true for any online advertising company; where we are different is that we need to match our ads to the publisher websites, and this is where the complexity of the system increases. Since we show ads on multiple websites we cannot have the same ads across the board, so all the ads are designed and maintained to keep pace with changing publisher websites. I have often heard that even on Agile projects QAs complain that requirements change too often and it is difficult to keep track of new functionality. At IM, since we don't have control over the different sites, we have to be very flexible in what we do and how we do it.
  • We initially started by mocking the publisher sites to test the ads, but that never gave complete and true feedback. To solve this problem we wrote simple tools that route production ad call requests to our local servers. By doing this we can see our ads, served from the development environment, on the publisher's production website locally, which gives a true picture of how the ads will look once released to production (a rough sketch of the routing idea follows this list).
  • Given the underlying fact of web testing that multiple browsers are in use, cross-browser testing is imperative. The tools were therefore extended to support ad serving on virtual machines so we could test on IE. With almost 5 different browsers and 4-5 publisher sites to be tested, the manual testing effort is definitely huge. Manual testing of new features cannot be ignored, but for regression it would not be very productive, and automating everything is also not a viable option. We instead opted for system monitoring tests and capturing screenshots from live sites after our daily deployments to assure that things are up and running (see the screenshot sketch after this list).
  • Being an advertising company, we do a lot of A/B testing and multivariate testing around our ads, which adds more load to our manual testing efforts. This is one of the main reasons we don't automate everything: with so many variables, automated tests are not very effective, given that those features will change again after a short period of time.
  • We do have an admin website through which we manage advertiser and publisher accounts; it is also used for accounting and billing purposes. Regression testing of this site is taken care of by our integration tests, written in Cucumber + Capybara + WebDriver and running on TeamCity (a small example of such a test follows this list).
  • We have been deploying changes to production every day for the last 8 months, and for this we need to make sure that our master branch is always ready to deploy. Initially all changes were merged directly to master and then tested both automatically and manually, so any issue introduced by a new change could hold up the production release; maintaining multiple versions of the same master branch was also considered a bad idea. We then started using git pull requests, which let developers review the code and QAs test the changes before they are merged to master. Failing tests no longer turn the master build red and can be fixed earlier, without the need to run the entire test suite locally. This approach has reduced deploy risks and production issues significantly.
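
To give a flavor of the ad-call routing mentioned in the first point, one simple way to get that effect on a tester's machine is a hosts-file override that resolves the production ad server's hostname to a locally running development ad server. The hostname below is made up for the example, and the tools we actually wrote do more than this; treat it as a sketch of the idea, not our implementation.

    # /etc/hosts on the tester's machine (hypothetical hostname)
    # The publisher's production page loads normally, but its ad calls to
    # ads.intentmedia.example now reach the dev ad server running locally.
    127.0.0.1   ads.intentmedia.example

With this in place, the publisher's live page looks exactly as users see it, while the ads embedded in it are served by the code under test.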
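
For the screenshot capture after daily deployments, here is a minimal sketch using Capybara with the Selenium WebDriver, the same stack as our integration tests. The publisher URLs and output path are placeholders invented for the example, not our real configuration.

    # capture_screenshots.rb - a hedged sketch, run after a deploy
    require 'capybara'
    require 'capybara/dsl'
    require 'selenium-webdriver'

    Capybara.default_driver = :selenium
    Capybara.run_server = false          # we are visiting live sites, not a local app

    include Capybara::DSL

    # Hypothetical list of publisher pages that carry our ads
    PAGES = {
      'orbitz_search'  => 'http://www.orbitz.com/example-search-page',
      'expedia_search' => 'http://www.expedia.com/example-search-page'
    }

    PAGES.each do |name, url|
      visit url
      # Save a timestamped screenshot so the team can eyeball the served ads
      page.save_screenshot("screenshots/#{name}-#{Time.now.strftime('%Y%m%d%H%M')}.png")
    end

The same script can, in principle, be pointed at a remote Selenium server on a virtual machine to cover IE.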
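
And for the admin-site regression suite, a trimmed-down example of what a Cucumber + Capybara scenario and its step definitions can look like. The feature text, route, and page content are invented for illustration; they are not our actual admin tests, and the step definitions assume Capybara's Cucumber support and rspec-expectations are loaded in the usual env.rb.

    # features/advertiser_accounts.feature
    Feature: Advertiser account management
      Scenario: Viewing the list of advertiser accounts
        Given I am on the advertiser accounts page
        Then I should see "Advertiser Accounts"

    # features/step_definitions/advertiser_steps.rb
    Given(/^I am on the advertiser accounts page$/) do
      visit '/admin/advertisers'          # hypothetical admin route
    end

    Then(/^I should see "(.*?)"$/) do |text|
      expect(page).to have_content(text)  # Capybara matcher via rspec-expectations
    end

On TeamCity the suite runs as an ordinary build step, which is what keeps this regression work out of the manual testing loop.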
As I said before, we are a growing company and have testing-related challenges that we are trying to deal with while exploring smart, innovative solutions. A few worth mentioning, which I believe are faced by many teams, are JavaScript testing, brittle and flaky tests, long build times, and, as always, IE testing.
     