HDF store: query performance of one large file vs. multiple files

I have some data (~24 GB altogether) saved in Microsoft Access files (.mdb). The .mdb files are split by category and by time (each month goes into a separate file), making ~40 files in total.

So I converted all files to CSV (using pyodbc) to get away from the proprietary Microsoft format. At the same time I also converted them to HDF5 because I was intrigued by the format, so right now I have one corresponding .h5 file per .mdb file. I tested the querying speed with pandas and I quite like this format.
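Roughly, the conversion pipeline looks like the sketch below (the path, table name, and the `timestamp` column are placeholders, not my real schema):

```python
import pandas as pd
import pyodbc

# Placeholder path and table name -- the real ones differ per category/month.
mdb_path = r"C:\data\category_2015_01.mdb"
table_name = "measurements"

# Read the Access table via the Access ODBC driver.
conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    rf"DBQ={mdb_path};"
)
df = pd.read_sql(f"SELECT * FROM {table_name}", conn, parse_dates=["timestamp"])
conn.close()

# Keep a CSV copy and an HDF5 copy; 'table' format allows on-disk queries later.
df = df.set_index("timestamp").sort_index()
df.to_csv("category_2015_01.csv")
df.to_hdf("category_2015_01.h5", key="data", format="table")
```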

Now my questions: would it be better to save all the content in one big file? Are there any drawbacks regarding query speed? How does query speed scale with file size? Writing speed is not important at the moment.

A further note: I store the data in HDF `table` format (not `fixed`), and my index is a timestamp (DatetimeIndex), so I can query date ranges on disk, as in the example below.
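For illustration, this is the kind of query I run (key name and dates are just examples matching the sketch above):

```python
import pandas as pd

# Select one week from a monthly file without loading the whole table.
subset = pd.read_hdf(
    "category_2015_01.h5",
    key="data",
    where="index >= '2015-01-05' & index < '2015-01-12'",
)

# Equivalent with an explicit HDFStore handle:
with pd.HDFStore("category_2015_01.h5", mode="r") as store:
    subset = store.select(
        "data", where="index >= '2015-01-05' & index < '2015-01-12'"
    )
```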

Are there any other tips/best practices with this format?