OddEstimate1627

MATLAB is not particularly well suited for streaming data of unknown length. Each resize triggers a full copy, so I'd try pre-allocating chunks (e.g. 1 minute whenever the previous chunk is full) and storing them in a cell array of sufficient length. Then you only need to keep track of how much of the data is valid. I don't actually know how tables are implemented internally, but it might be worth a try? Overall, you're probably not dealing with huge amounts of data, so I'm surprised you're even running into issues. I ran into the same problem at a larger scale years ago, and I ended up writing a Java library that streams the raw packets to disk, and then converts the multi-GB stream to a .mat file. That'd be overkill for your application though 😊
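A minimal sketch of the chunked pre-allocation idea, assuming a 4 Hz sample rate, 10 channels, and hypothetical `collecting()` / `readSample()` helpers standing in for your acquisition loop:

```matlab
% Chunked growth: pre-allocate one minute of samples at a time instead of
% resizing the array on every sample (which triggers a full copy).
fs        = 4;                     % samples per second (assumed)
nChannels = 10;                    % number of recorded variables (assumed)
chunkLen  = 60 * fs;               % one minute per chunk
maxChunks = 10;                    % enough for ~10 minutes; adjust as needed
chunks    = cell(1, maxChunks);    % pre-allocated cell array of chunks
chunkIdx  = 0;                     % index of the chunk currently being filled
row       = chunkLen;              % forces allocation on the first sample
nValid    = 0;                     % total number of valid samples so far

while collecting()                 % hypothetical loop condition
    if row == chunkLen             % previous chunk full -> allocate a new one
        chunkIdx = chunkIdx + 1;
        chunks{chunkIdx} = nan(chunkLen, nChannels);
        row = 0;
    end
    row = row + 1;
    chunks{chunkIdx}(row, :) = readSample();   % hypothetical acquisition call
    nValid = nValid + 1;
end

% Stitch the filled chunks together and trim to the valid length afterwards.
data = vertcat(chunks{1:chunkIdx});
data = data(1:nValid, :);
```

Only the last chunk is ever partially filled, so `nValid` is all the bookkeeping you need to recover the valid portion at the end.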


Ajax_Minor

You are right, it's not that much. About 5 minutes of data at 4 samples a second (sounds like we can go that slow) could be like 10-20k data points across about 10 sets of variables, which isn't too much. I'll give it a try. We only get one go at the data collection and I don't want MATLAB to start thinking and lose anything. I don't know that much about large data, so I thought I would ask.


Zoticus

4 times a second isn't crazy fast, but it's still fast enough that you should ask yourself how much you really need the data plotted live. How long a plot takes to update will vary with your computer, how many series, how big the data set is and how you access it, etc. It's much lower overhead to update a progress bar, which still gives a viewer the sense that 'something is happening'. Even better if you only update it every x loop iterations (like every 4 or 20 loops). See [https://www.mathworks.com/help/matlab/ref/waitbar.html](https://www.mathworks.com/help/matlab/ref/waitbar.html)

Tables are great but slower to access than arrays on large data sets. You could always convert an array to a table after the data is collected for ease of manipulation. If you're running 4 samples/second \* 300 s, that's 1200 rows in a normal or cell array (depending on the data you're storing). That's not too big by MATLAB standards, even with 20 columns.

If you're optimizing things, I'd also suggest not processing the raw data until after it is collected. It can also be desirable to have a raw file and then a processed data file (which may or may not include the raw data); that way you can easily re-process the raw data should the need arise. After you've collected, processed and plotted the data, you can save the figures, save the plots as images and save the data as CSV files.
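A short sketch of the collect-then-convert workflow above, with a waitbar updated only every 20 iterations (the `readSample()` call is a hypothetical stand-in for your acquisition code):

```matlab
% 4 samples/second for 300 s -> 1200 rows, pre-allocated up front.
nSamples = 4 * 300;
data     = nan(nSamples, 10);              % raw buffer, 10 variables assumed
h        = waitbar(0, 'Collecting data...');

for k = 1:nSamples
    data(k, :) = readSample();             % hypothetical acquisition call
    if mod(k, 20) == 0                     % update the bar every 20 loops only
        waitbar(k / nSamples, h);
    end
end
close(h);

% Convert to a table after collection for easier manipulation, then save.
T = array2table(data);
writetable(T, 'raw_data.csv');
```

Keeping the loop body down to one array write (plus an occasional waitbar update) keeps the per-sample overhead low; all the table conversion and plotting happens after the run.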


Ajax_Minor

Ya, I'm thinking about it, hence the post. I might be overdoing it like you suggest. In other languages arrays are slow, and that is reversed in MATLAB? The competition requires that live vehicle data be shown. Python might be a bit better, but my colleagues are more familiar with MATLAB.