Given billions of rows of data across a few tables, how do you best make sense of it?
My current method involves thinking up interesting questions and writing database queries. I then plug the resulting data into gnuplot to examine it.
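For concreteness, that query-then-gnuplot loop might look like the sketch below. The table name, columns, and file name are made up for illustration; substitute your own schema. It runs a query and dumps the rows as whitespace-separated columns, which gnuplot reads directly.

```python
import sqlite3

# Hypothetical schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (day TEXT, count INTEGER);
    INSERT INTO events VALUES
        ('2024-01-01', 10),
        ('2024-01-02', 14),
        ('2024-01-03', 9);
""")

# Run the ad-hoc query, then write one data point per line
# in gnuplot's whitespace-separated format.
rows = conn.execute("SELECT day, count FROM events ORDER BY day").fetchall()
with open("events.dat", "w") as f:
    for day, count in rows:
        f.write(f"{day} {count}\n")

# Then, in gnuplot:
#   plot 'events.dat' using 2 with lines
```

The pain point is that every new question means another round trip through this edit-query-export-plot cycle by hand.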
Is there generally a better way? I'm kind of hoping a Mathematica/MATLAB-style shell exists for databases or other data sources: just type a query and view a graph. Even better, type queries and have the graphs output to a web page.
Or is the method to hire a data scientist to build specialised reports?
I'm agnostic about the data format; I'm interested in how this works across all ecosystems.
Your inquiry won't go anywhere until you describe the problem you're trying to solve. Be specific, if only for a single example problem.
I say this because there's no generic solution to accessing a large database -- the solution depends on the goal.