Suppose you have two versions of a website landing page (A and B) and want to determine which one produces the most conversions. A/B testing can compare many other elements as well: logos, colors, calls to action ("Buy now!"), placement, headlines, page layout, images, the amount of text on a page, and so on. Think of A/B testing (also called split testing) as a methodology that lets marketers compare test variations against a baseline control sample (the existing element) to identify and measure which version is most effective at achieving the desired outcome. A/B testing begins with a baseline control (A) and a variation (B), or alternative version. A and B are tested simultaneously to measure which version performs better.
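The comparison at the heart of an A/B test is simple: measure the conversion rate of each version and look at the lift of the variation over the control. A minimal sketch in Python, using hypothetical visitor and conversion counts:

```python
# Sketch: comparing conversion rates of a control (A) and a variation (B).
# The visitor and conversion counts below are hypothetical examples.

def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

# Hypothetical results after running both versions simultaneously.
a_visitors, a_conversions = 5000, 200   # control (A)
b_visitors, b_conversions = 5000, 260   # variation (B)

rate_a = conversion_rate(a_conversions, a_visitors)
rate_b = conversion_rate(b_conversions, b_visitors)

# Relative lift of B over A.
lift = (rate_b - rate_a) / rate_a
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, lift: {lift:+.0%}")
```

With these example numbers, B converts at 5.2% versus A's 4.0%, a 30% relative lift. Raw lift alone is not enough to declare a winner, though; the test also needs enough traffic to rule out chance.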
After you have decided what to test, choose a good tool for the project. Google Analytics offers an A/B testing feature called Content Experiments. To access it, log into your Google Analytics account, open the profile you want to run an experiment in, and click the Standard Reporting tab. In the left menu, click Content, then Experiments, and start experimenting.
Tips for A/B testing:

- Never wait to test the variation until after you have tested the control; always test both versions simultaneously.
- Don't let your gut feeling overrule test results. The winners of A/B tests are often surprising or counterintuitive.
- Know how long to run a test before giving up. Giving up too early can cost you, because you might have gotten meaningful results had you waited a little longer.
- Keep your A/B test consistent across the whole website. If you are testing a sign-up button that appears in multiple locations, a visitor should see the same variation everywhere.
- Run many A/B tests. A single test can have only three outcomes: no result, a negative result, or a positive result. The key to optimizing conversion rates is to run many tests, so that the positive results add up to a large boost in traffic and conversions.
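One common way to decide whether a test has run long enough is a statistical significance check. The sketch below (with hypothetical counts) uses a standard two-proportion z-test: if the p-value is still above the usual 0.05 threshold, keep the test running rather than giving up early.

```python
# Sketch: a two-proportion z-test to check whether B's lift over A is
# statistically significant before ending the test. Counts are hypothetical.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference
    in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = z_test(200, 5000, 260, 5000)
if p < 0.05:
    print(f"Significant (z={z:.2f}, p={p:.4f}): safe to declare a winner.")
else:
    print(f"Not yet significant (p={p:.4f}): keep the test running.")
```

Dedicated testing tools (including Content Experiments) handle this math for you, but the principle is the same: a result only counts once chance has been ruled out at your chosen confidence level.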