[LRUG] A/B testing talk - tools followup

Simon Coffey simon at urbanautomaton.com
Fri Dec 16 02:35:31 PST 2016


Thanks very much for listening to my A/B testing talk on Monday.

I said I'd follow up to the list with some of the tools we used for the
experiment, logging and analysis.

*Running the test*

The A/B testing gem we use is called vanity <http://github.com/assaf/vanity>,
and the other main (and slightly more widely-used) gem for rails apps is
split <http://github.com/splitrb/split>. We've lightly modified vanity to
support logging the A/B test assignments - at the moment this code is
pretty specific to our app, though.
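For anyone who hasn't seen it, a vanity experiment is just a small Ruby DSL file. This is a sketch only - the experiment name, alternatives and metric below are invented, not from our app:

```ruby
# experiments/pricing_page.rb (vanity looks for experiment definitions here)
ab_test "Pricing page" do
  description "Does the new pricing layout increase signups?"
  alternatives :old_layout, :new_layout  # the variants to assign visitors to
  metrics :signup                        # the conversion metric being measured
end
```

You then call `ab_test(:pricing_page)` in a view or controller to get (and record) the visitor's assigned alternative, and `track! :signup` when they convert.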

The gem we use for event logging is ahoy <https://github.com/ankane/ahoy> (gem
name: ahoy_matey). It's great, and I would immediately add it or something
like it to every app I work on in future. (See also: lograge
<https://github.com/roidrage/lograge>, which
one-json-blob-per-request-ifies your rails logs.)
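To give a flavour of why I like ahoy: tracking an event from a controller is one line. (The event name and properties here are made up for illustration.)

```ruby
# In any Rails controller action - ahoy is a helper the gem provides.
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])
    # Each call records an Ahoy::Event with a name, a properties hash,
    # the visit it belongs to, and a timestamp.
    ahoy.track "Viewed article", article_id: @article.id
  end
end
```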

The logs are aggregated using fluentd <http://www.fluentd.org/> and stored
in S3 (as well as being sent to loggly for live querying). We're now also
importing them into Amazon Redshift.
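The rough shape of a fluentd pipeline like that looks something like this - note the tag, paths and bucket name below are invented, not our actual config:

```
<source>
  @type tail                   # follow the app's event log file
  path /var/log/app/events.log
  tag app.events
  <parse>
    @type json                 # one JSON object per line
  </parse>
</source>

<match app.events>
  @type s3                     # fluent-plugin-s3: batch events up to S3
  s3_bucket our-event-logs
  s3_region eu-west-1
  path logs/
</match>
```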

*Test analysis*

After downloading the logs for the relevant date range, we wrote a makefile
that used:

   - jq <https://stedolan.github.io/jq/> to filter the events and flatten
   them into a single csv per event type with the important fields
   - sqlite3 <https://sqlite.org/> to import these CSVs as tables and do
   the joins required to match the events, emitting a one-line-per-participant
   csv with the conversion info
   - wizard <http://www.wizardmac.com/> to pivot the resulting csv into the
   final tallies (it also lets you create fancier models for your results if
   you're inclined to, e.g. controlling for variables)
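To make the first two steps concrete, here's the flatten-and-join in plain Ruby rather than jq + sqlite3. The event shapes are invented for illustration - the real logs have different fields:

```ruby
require "json"
require "csv"

# A tiny stand-in for the downloaded logs: one JSON object per line.
events = <<~LOGS.lines.map { |line| JSON.parse(line) }
  {"name":"assignment","properties":{"visitor_id":"v1","alternative":"a"}}
  {"name":"assignment","properties":{"visitor_id":"v2","alternative":"b"}}
  {"name":"conversion","properties":{"visitor_id":"v2"}}
LOGS

# The jq step: filter each event type, keeping only the important fields.
assignments = events.select { |e| e["name"] == "assignment" }
                    .map { |e| e["properties"].values_at("visitor_id", "alternative") }
converted = events.select { |e| e["name"] == "conversion" }
                  .map { |e| e["properties"]["visitor_id"] }

# The sqlite3 step: join them into one line per participant with conversion info.
rows = assignments.map { |visitor, alt| [visitor, alt, converted.include?(visitor)] }

puts CSV.generate(headers: %w[visitor_id alternative converted], write_headers: true) { |csv|
  rows.each { |row| csv << row }
}
```

Obviously jq streams gigabytes where this loads everything into memory, but the logic is the same.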

jq in particular is a great tool if you have a stream of JSON objects to
process. Wizard is nice too (well worth the price I think) and lets me
pretend to be a data scientist occasionally, with only minimal grumbling
from the real experts.

I mentioned Redshift - this puts our events in a big Postgres-compatible
data warehouse, replacing the download + jq + sqlite3 steps above with a single
SQL query. This is pretty cool IMO, not least because downloading gigabytes
of logfiles over office wifi tends to make your colleagues cranky.
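The single query has roughly this shape - the table and column names below are hypothetical, since the real schema depends on how the events are imported:

```sql
-- One line per participant with conversion info, straight from the warehouse.
SELECT a.visitor_id,
       a.alternative,
       (c.visitor_id IS NOT NULL) AS converted
FROM assignments a
LEFT JOIN conversions c ON c.visitor_id = a.visitor_id
WHERE a.experiment = 'pricing_page';
```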

Anyway, enough from me about logs. See you in January. :-)
