[LRUG] Using statistical tables with Rails

Gerhard Lazu gerhard at lazu.co.uk
Fri Sep 23 04:07:48 PDT 2011


Richard, I've tried the pure Ruby driver approach for Mongo; it requires a
lot of work if you want to keep it pretty. It depends what you're doing,
but even with 3-4 collections and simple relationships, an ORM makes life
easy with little effort.

Mongoid with some sensible decisions worked best for us: only about 20%
slower than using the driver directly (I went crazy on the benchmarks). If
you write the app using EM (EventMachine) and pool the connections, you'll
hardly notice the 20% difference.
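
(A rough sketch of the trade-off, for illustration only -- model,
collection and field names are invented, in the style of the 1.x driver
and Mongoid 2.x:)

    require 'mongo'
    require 'mongoid'  # assumes Mongoid has been configured elsewhere

    # Raw driver: fastest, but every query is a hand-rolled hash
    db = Mongo::Connection.new("localhost").db("myapp")
    db.collection("events").find("kind" => "signup").count

    # Mongoid: ~20% overhead in our benchmarks, far nicer to live with
    class Event
      include Mongoid::Document
      field :kind, :type => String
    end
    Event.where(:kind => "signup").count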

This is my 2nd large-ish production Mongo & Redis pure-MRI app; I've been
in both places : ).

Gerhard


On Fri, Sep 23, 2011 at 11:55 AM, Richard Livsey <richard at livsey.org> wrote:

> I'd recommend taking a look at MongoDB [1]; your use-case sounds like
> what it was made for, and Mongo is really easy to get up and running to
> play with.
>
> Because it's document-oriented ('schema-less') you've got a lot of
> flexibility in how you structure the data you want to store. For
> example, you're not stuck with a table full of NULL fields just because
> not every event contains the same information.
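>
> For example (collection and field names invented -- just a sketch with
> the 1.x Ruby driver), two differently-shaped events can happily sit in
> the same collection:
>
>     require 'mongo'
>     db = Mongo::Connection.new("localhost").db("myapp")
>     events = db.collection("events")
>     events.insert("type" => "page_view", "path" => "/pricing")
>     events.insert("type" => "purchase", "amount" => 999, "user_id" => 42)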
>
> It's very good for throwing large amounts of event/log-style information
> into, while at the same time it has atomic modifiers so you can build up
> aggregates as you go. You still get indexes and the ability to run
> ad-hoc queries, so you can start by just dumping data in there and work
> on the aggregate counters over time.
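>
> For instance, a per-day counter kept up to date with an atomic upsert
> might look like this (reusing the db handle from the sketch above,
> names again invented):
>
>     counters = db.collection("daily_counters")
>     counters.update(
>       { "type" => "page_view", "day" => Time.now.strftime("%Y-%m-%d") },
>       { "$inc" => { "count" => 1 } },  # atomic on the server
>       :upsert => true                  # creates the doc on first sight
>     )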
>
> The raw Ruby driver is really nice, and you don't need an ORM for most
> tasks; if you have a more complex model then it's worth looking at
> MongoMapper [2] or Mongoid [3].
>
> Hope that helps!
>
> [1] - http://mongodb.org
> [2] - http://mongomapper.com
> [3] - http://mongoid.org
>
> --
> Richard Livsey
> Co-Founder, MinuteBase
> Meeting collaboration made easy
> http://minutebase.com
> +44 (0) 7841 260 797
>
> On Friday, 23 September 2011 at 11:37, Neil Middleton wrote:
>
> > I'm building an app that needs to store a fair number of events that
> > the users carry out. (Think LOTS, as in millions per month.)
> >
> > I need to report on these events (total of type x in the last month,
> > etc.) and need something resilient and fast.
> >
> > I've toyed with Redis etc. to store aggregates of the data, but this
> > could just mean that I'm building up a massive store of single-figure
> > aggregates that aren't rebuildable.
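> >
> > (For instance -- key names invented -- the kind of Redis counter I
> > mean:)
> >
> >     require 'redis'
> >     redis = Redis.new
> >     # one counter per event type per month: cheap to bump, but once
> >     # the counter is all you keep, the raw events are gone for good
> >     redis.incr("events:page_view:2011-09")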
> >
> > Whilst this isn't a bad solution, I'm looking at storing the raw event
> > data in tables that I can then query as needed, and potentially
> > generating aggregate counters on a periodic basis. This would give me
> > the ability to add counters over time, and also to carry out ad-hoc
> > inspections of what is going on, something which aggregates alone
> > don't allow.
> >
> > The question is, what's the best way to do this? I obviously don't
> > want to have to create a model for each table (which is what Rails
> > would prefer), so do I just create the tables and interact with raw
> > SQL as needed (sketched below), or is there some other choice for
> > dealing with this sort of data?
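> >
> > (Something like this raw-SQL route is what I mean -- the table,
> > columns and SQL dialect (Postgres) are invented for the sake of the
> > sketch:)
> >
> >     conn = ActiveRecord::Base.connection
> >     conn.execute(<<-SQL)
> >       INSERT INTO events (kind, payload, created_at)
> >       VALUES ('page_view', '{"path":"/pricing"}', NOW())
> >     SQL
> >     totals = conn.select_all(
> >       "SELECT kind, COUNT(*) AS total FROM events
> >        WHERE created_at > NOW() - INTERVAL '1 month' GROUP BY kind"
> >     )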
> >
> > It would be interesting to know what thoughts you guys have.
> >
> > Cheers
> >
> > Neil
> >
>
> _______________________________________________
> Chat mailing list
> Chat at lists.lrug.org
> http://lists.lrug.org/listinfo.cgi/chat-lrug.org
>