So I have a database that contains logs of people clicking a link. What I store is id, country, referrer-domain, and clickedat. That last column is a timestamp.
At the moment there are not a lot of rows, but if this takes off it could grow to tens or hundreds of thousands of rows. What is the best way to query the database for things like a click count between two dates?

Using a COUNT(*) over the raw rows would eventually be way too slow. I've seen techniques where you add an extra query to every INSERT, UPDATE, and DELETE to maintain a special aggregation table, but I'm not sure that would work here, since I'd like users to be able to select two arbitrary dates, for example. Or I'd have to aggregate by day.
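The "aggregate by day" idea mentioned above can be sketched roughly like this, using SQLite from Python; the table and column names are illustrative assumptions, not the asker's actual schema. Each insert also bumps a per-day counter, and an arbitrary date range is then answered by summing a handful of daily rows instead of scanning every click:

```python
import sqlite3

# Hypothetical schema; column names mirror the question's description.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE clicks (
        id INTEGER PRIMARY KEY,
        country TEXT,
        referrer_domain TEXT,
        clickedat TEXT            -- ISO-8601 timestamp
    );
    -- Per-day aggregation table, maintained alongside every insert.
    CREATE TABLE clicks_per_day (
        day TEXT PRIMARY KEY,
        n   INTEGER NOT NULL
    );
""")

def record_click(country, referrer, ts):
    """Insert the raw row and bump the daily counter in one transaction."""
    with con:
        con.execute(
            "INSERT INTO clicks (country, referrer_domain, clickedat) "
            "VALUES (?, ?, ?)",
            (country, referrer, ts),
        )
        con.execute(
            "INSERT INTO clicks_per_day (day, n) VALUES (date(?), 1) "
            "ON CONFLICT(day) DO UPDATE SET n = n + 1",
            (ts,),
        )

record_click("NL", "example.com", "2023-05-01 09:30:00")
record_click("NL", "example.com", "2023-05-01 10:15:00")
record_click("DE", "other.org",   "2023-05-02 08:00:00")

# Arbitrary date range: sum the pre-aggregated daily rows.
total = con.execute(
    "SELECT COALESCE(SUM(n), 0) FROM clicks_per_day "
    "WHERE day BETWEEN ? AND ?",
    ("2023-05-01", "2023-05-02"),
).fetchone()[0]
print(total)  # 3
```

The trade-off is exactly the one raised in the question: ranges are limited to whole-day granularity, and every write now touches two tables.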
If you add an indexed date column, so you're not doing date/time calculations on the fly, you should just be able to query it using normal aggregations. With properly written queries it would take a long time before that became 'way too slow'.
If it takes off, look into de-normalising the data as you describe, but don't optimise it prematurely!