  C++ / database question  
From: scott
Date: 16 Feb 2009 05:07:13
Message: <49993ad1@news.povray.org>
I am planning to write a C++ Windows-based application to retrieve and 
analyse some data.  I imagine the data being stored in some sort of database 
table, with each row having perhaps 10 pieces of data (some text, some 
integer) plus a unique ID (this is generated for me).  I expect an absolute 
maximum of 1000 rows of data to be added per day, so after a year or so 
we're talking a few hundred thousand rows in this "table", not more than a 
million.
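
As a rough sketch (the field names and types here are just placeholders, 
not my real data), I picture each row as something like:

#include <string>

// One row of the table -- fields are made up purely for illustration.
struct Record
{
    long        id;      // unique ID, generated for me
    std::string name;    // a couple of text fields...
    std::string notes;
    int         count1;  // ...and some integer fields
    int         count2;
    // ...roughly 10 fields in total
};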

The analysis I need to do will be fairly straightforward, like taking 
averages and sums of each bit of data over all (or subsets of) rows, and 
maybe even some simple filtering, but nothing fancy.
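
Just to illustrate the kind of thing I mean (assuming the made-up Record 
sketch above, with the rows held in a std::vector):

#include <vector>
#include <cstddef>

// Average of one integer field over all rows -- purely illustrative,
// the field name comes from the placeholder Record struct above.
double averageCount1(const std::vector<Record>& records)
{
    long total = 0;
    for (std::size_t i = 0; i < records.size(); ++i)
        total += records[i].count1;
    return records.empty() ? 0.0
                           : static_cast<double>(total) / records.size();
}

// Count of rows matching a simple filter (again, just an example).
std::size_t rowsAbove(const std::vector<Record>& records, int threshold)
{
    std::size_t n = 0;
    for (std::size_t i = 0; i < records.size(); ++i)
        if (records[i].count1 > threshold)
            ++n;
    return n;
}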

My question is, should I be looking to use some external database engine to 
do the backend work here for me, or can I get away with just using STL 
containers like "set" or something, with a struct that holds my data?

Anything else I should think about before deciding which way to go?

If a database engine would be better, any recommendations of which one?  An 
easy-to-install, easy-to-learn one would be preferable, it must be free to 
distribute with my program, and of course it should be easily accessible 
from C++.  I know a *tiny* bit of SQL but I think I could learn what I need 
quite quickly.

And if you think I can just do it in C++ without using an engine, is using 
"set" from the STL the best way?  What is a practical limit to the size (in 
MB) of a set I should be manipulating/loading/saving?  I'm thinking that 
even a million rows at, say, 256 bytes each (roughly 250 MB) should be fine 
for an application to load/save and work with in memory nowadays?
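
To be concrete, I was picturing something along these lines (whether set, 
map or vector is the right container is exactly what I'm unsure about; 
Record is the made-up sketch from above):

#include <map>

// One possible in-memory layout: key the rows on the unique ID so that
// lookups and duplicate checks are cheap.  Loading/saving would just be
// reading/writing this map to a file in some format.
typedef std::map<long, Record> RecordTable;

void addRow(RecordTable& table, const Record& r)
{
    table[r.id] = r;   // insert or replace by ID
}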

Any other thoughts?

