How do I test for an evaluation license?

Posted: 11/04/2008 14:53:42
by DRichard Richard (Basic support level)
Joined: 11/04/2008
Posts: 4

I'm a registered user of the SolFS tools.
We are experiencing low performance on read operations on a SolFS archive. The web site says that the evaluation version deliberately decreases the performance of the tool.

I have tried to:
1) Change the registration key used.
2) Not register the provided key at all.

In both cases I was not able to measure any change in operation speed, which leads me to the conclusion that I may not have been properly registered in the first place.

Is there a bullet-proof way to ensure that I'm properly registered? (I do not receive a popup message or any other indication...)
Posted: 11/04/2008 15:32:19
by Eugene Mayevski (EldoS Corp.)

There's no way to check whether the key has been set correctly. However, the lack of a speed difference means that the inefficiency of your settings is significant enough to "hide" the delays added by the evaluation version. If you describe your usage scenario (how many files of what size are added/deleted/read/written), we will suggest some ways to optimize operations.

Sincerely yours
Eugene Mayevski
Posted: 11/04/2008 15:53:47
by DRichard Richard (Basic support level)
Joined: 11/04/2008
Posts: 4

The main choke point is when we try to access large files contained in the archive. The files we use are commonly not under 250 MB, but they can be as big as 20 GB.

We have a collection of small contained files in a separate folder, and usually one (or a few) large contained files.

To measure the performance of the tool, I'm using a single storage file with a single contained 240 MB file.

We have experimented with different Buffering settings, and have found the best value to be 1.
We have experimented with different PageSize values and have found that the optimal value is 32K, since we are reading large chunks of data sequentially before jumping to a new location. Decreasing the PageSize value does not help us.

Of course encryption and compression are not used.

Some metrics:
The archive file (A, 255 MB) contains a single contained file (C, 240 MB).
On the test computer, I can do a binary read of (A) and write it back to disk in 8 seconds (roughly 32 MB/s).
Using SolFS Explorer, I can extract the contained file (C) to disk in 17 seconds (roughly 14 MB/s).
Another interesting bit of information: we are able to generate data in memory and persist it to a SolFSStream at twice the rate at which we can read back from that same stream.

This is symptomatic of the performance I am trying to improve.

Going into finer detail, we have found that the performance choke point is a call to:
_sourceStream.Read(buffer, 0, dataSize);

where _sourceStream is a SolFSStream.
The buffer is allocated once and reused for each subsequent call to the Read method.
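
For reference, our reading loop is essentially the following (a simplified sketch; I am assuming here that the SolFSStream can be driven through the System.IO.Stream base class, which is how our code treats it, and I am only showing the timing part):

// Simplified timing loop. Assumption (ours): the SolFS stream is used
// through the System.IO.Stream base class, as in our code.
using System;
using System.Diagnostics;
using System.IO;

static class ReadBenchmark
{
    // Reads the whole stream in fixed-size chunks, reusing one buffer,
    // and reports the effective transfer rate.
    public static void Measure(Stream sourceStream, int chunkSize)
    {
        byte[] buffer = new byte[chunkSize];
        long total = 0;
        Stopwatch watch = Stopwatch.StartNew();

        int read;
        while ((read = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
            total += read;

        watch.Stop();
        double mbPerSec = (total / (1024.0 * 1024.0)) / watch.Elapsed.TotalSeconds;
        Console.WriteLine("{0} bytes read in {1:F1} s ({2:F1} MB/s)",
                          total, watch.Elapsed.TotalSeconds, mbPerSec);
    }
}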

Posted: 11/05/2008 01:33:21
by Alexander Plas (EldoS Corp.)


Can you provide us with a storage file containing your data files and/or some testing code?
Posted: 11/05/2008 15:45:40
by DRichard Richard (Basic support level)
Joined: 11/04/2008
Posts: 4

It took me some time to build a small sample project to demonstrate my points, sorry.

I will just need a way to send my zip file to you. I tried to attach it to the forum post, but that does not work. I am also not able to access your e-mail address, for some unknown reason.

Procedure: the sample file I provided is quite large.
Please copy the provided "sample.data" file to c:\.
Please also make a copy of the same file in the same location, but with the name "sample.data2".

(This second copy is there to ensure that we can jump from one file to the other, so that Windows caching does not skew our performance test conclusions.)

The provided source code will:
1) Open the first master file with SolFS.
2) Open the first contained file (located at the root level to simplify the code).
3) Open a second contained file at the same location.
4) Read blocks of data from the first file, interspersed with smaller reads from the second file (a simplified sketch of this pattern follows the list).
5) Open the second master file using a standard .NET stream.
6) Perform the same kind of reading pattern, for comparison purposes.
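
In essence, the comparison loop for steps 4 and 6 looks like this (a simplified sketch; in the real sample the two streams are SolFS contained files on the first pass and plain .NET FileStream objects on the second, and the chunk sizes are only illustrative):

using System.IO;

static class InterspersedReader
{
    // Reads large blocks from the first stream, interspersed with small
    // reads from the second one, mimicking our real access pattern.
    public static long ReadPattern(Stream largeFile, Stream smallFile,
                                   int largeChunk, int smallChunk)
    {
        byte[] largeBuffer = new byte[largeChunk];
        byte[] smallBuffer = new byte[smallChunk];
        long total = 0;

        int read;
        while ((read = largeFile.Read(largeBuffer, 0, largeBuffer.Length)) > 0)
        {
            total += read;
            // Interspersed smaller read from the second contained file.
            total += smallFile.Read(smallBuffer, 0, smallBuffer.Length);
        }
        return total;
    }
}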

Conclusion: depending on cache usage, I get between 3 and 10 times better performance using a standard .NET stream than going through the SolFS system.
Obviously that is not an option for our application, which handles lots of files, but it illustrates the performance cost that I would like to alleviate.
Posted: 11/06/2008 08:14:09
by DRichard Richard (Basic support level)
Joined: 11/04/2008
Posts: 4

I managed to zip my source code and post it, but I'm still not able to package my sample data file with it. It is simply too big.

I have tested several configurations with different data files and get similar results.
Please create yourself a SolFS storage (using SolFS Explorer) called Sample.data.

In this file, at the root, create two contained files of 240 MB: one named Cycles.bin and the other called Headers.bin.

That will give you a simplified storage file which can be used with the source code to test the reported behaviour.

(I also removed the SolFS library from the zip file; the post size limit forced that as well.)

[ Download ]
Posted: 11/07/2008 10:07:06
by Eugene Mayevski (EldoS Corp.)

Moved to HelpDesk.

Sincerely yours
Eugene Mayevski
Posted: 12/15/2008 14:13:25
by Christian Falardeau (Standard support level)
Joined: 02/07/2007
Posts: 17


I will continue to monitor this case for the user DRichard Richard. He is my colleague and we are working together to optimize file reading in order to display volumetric data.

Since it has been moved to the Help Desk, we haven't heard anything about this problem. I would like an update on it, if you have any.

We've completed the code sample that was provided to you. One of our modifications was to create a class that inherits from your original SolFS stream class and implements another cache (a fixed size of 64 KB) on top of the original one. We were able to double the performance of reading operations. Honestly, we don't understand how this can happen.
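
Roughly, our wrapper behaves as follows (a simplified sketch; the real class derives from your SolFSStream and overrides Read, whereas this version just wraps any System.IO.Stream and omits Seek/Write handling for brevity):

using System;
using System.IO;

// Simplified read cache: keeps the last 64 KB block read from the
// underlying stream and serves small sequential reads from it.
class CachedReadStream
{
    private const int CacheSize = 64 * 1024;
    private readonly Stream _inner;
    private readonly byte[] _cache = new byte[CacheSize];
    private long _cacheStart;   // position in _inner where _cache[0] was read from
    private int _cacheLength;   // number of valid bytes currently in _cache
    private long _position;     // logical read position of this wrapper

    public CachedReadStream(Stream inner) { _inner = inner; }

    // May return fewer bytes than requested (standard Stream.Read contract);
    // callers are expected to loop.
    public int Read(byte[] buffer, int offset, int count)
    {
        // Refill the cache when the current position falls outside of it.
        if (_position < _cacheStart || _position >= _cacheStart + _cacheLength)
        {
            _inner.Position = _position;
            _cacheStart = _position;
            _cacheLength = _inner.Read(_cache, 0, CacheSize);
            if (_cacheLength == 0)
                return 0; // end of stream
        }

        int available = (int)(_cacheStart + _cacheLength - _position);
        int toCopy = Math.Min(count, available);
        Buffer.BlockCopy(_cache, (int)(_position - _cacheStart), buffer, offset, toCopy);
        _position += toCopy;
        return toCopy;
    }
}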

When we read a 500 MB file:

- With our cache:
  - Reading in chunks of 50 bytes (sometimes we cannot do otherwise), we see a performance degradation of at least 30% (a ~90 MB/sec transfer rate).
  - Reading in chunks of 1500 bytes, we obtain a ~120 MB/sec transfer rate.

- Without our cache (pure SolFS stream only):
  - Reading in chunks of 50 bytes, we obtain a ~1 MB/sec transfer rate.
  - Reading in chunks of 1500 bytes, we obtain a ~40 MB/sec transfer rate.

- If we read the same SolFS file with the FileStream class provided by .NET instead, we obtain a ~90 MB/sec transfer rate.

We would expect a maximum of 10%-15% performance loss using SolFS storage compared with the .NET FileStream class.

We've also been very surprised that the Length and Position members of the SolFS stream class take a lot of processing time in our profiling sessions. During those sessions, we have also noticed that a lot of processing time is consumed by the page-loading methods used internally to maintain the cache.

I think it would be nice to have a special mode for reading files that accelerates performance. In our case, if a file contains data acquired from a device, then once the acquisition has finished the file can no longer change. So maybe some optimization could be done to increase the reading performance.

If you want, I can provide you with the updated code sample. Also, if you can set up an FTP link, I can send you a typical data file that we acquire during an inspection.

I would like to know what you think about all the information in this post. It is very important to us to reach the performance of the .NET FileStream class and to improve the performance of reading small chunks (< 200 bytes).

Thank you!
Posted: 12/16/2008 00:45:15
by Eugene Mayevski (EldoS Corp.)

cfalardeau wrote:
Since it has been moved to the Help Desk, we haven't heard anything about this problem. I would like an update on it, if you have any.

Please ask your colleague to check the helpdesk ticket (http://www.eldos.com/support/ticket_edit.php?ID=14334). He didn't reply to the request and the ticket was closed automatically.

As for your measurements, let's continue in the HelpDesk, please.

Sincerely yours
Eugene Mayevski
Posted: 12/16/2008 08:23:09
by Eugene Mayevski (EldoS Corp.)

BTW, your colleague can also assign the license ticket to his account to be recognized as a registered user. This will speed up identification of the license and processing of the requests.

Sincerely yours
Eugene Mayevski