mongoose
Contributor
OK, I've been toying around with this on my computer, trying to calculate an average value for a set of readings over time.
MY QUESTION: Imagining a dot-to-dot line through all the data points, and that being a "curve", if I take the area under that curve, and divide it by the amount of time (x-axis) over which I am measuring, it seems to yield the average value of all my readings over that span of time.
This seems too easy; is my algorithm flawed?
Thanks,
--'Goose
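
The approach described above (area under the dot-to-dot line divided by the total time span) is a time-weighted average, computed via the trapezoidal rule. A minimal sketch of the idea, with illustrative sample data (the names `times` and `readings` are just placeholders):

```python
def time_weighted_average(times, readings):
    """Area under the piecewise-linear 'dot-to-dot' curve (trapezoidal
    rule), divided by the total time span on the x-axis."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        # Each segment contributes a trapezoid: average height * width.
        area += 0.5 * (readings[i] + readings[i - 1]) * dt
    return area / (times[-1] - times[0])

# Evenly spaced samples for illustration.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
readings = [10.0, 12.0, 11.0, 13.0, 14.0]
print(time_weighted_average(times, readings))  # 12.0
```

Note that this only matches the plain arithmetic mean of the readings when the samples are (roughly) evenly spaced; with uneven spacing, readings that persist over longer intervals are weighted more heavily, which is often exactly what you want for readings over time.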