SOLIDWORKS 2015 – The How & Why of Reduced File Sizes #solidworks #sw2015

It was earlier this year (May 2014) when I first became aware that we would see a significant reduction in the size of files (on disk) with the release of SOLIDWORKS 2015.  I had put out a post, Why Do I Have to Follow the Rules!, which had a look at modelling techniques and their effect on performance...  Now that you're back after looking at that post, you can see that I had some discussions with Vajrang Parvate – Director of Product Development, Dassault Systemes SolidWorks – and at the time he had provided some early information on what we could expect (file-wise) with SOLIDWORKS 2015.

It was something I was particularly keen to see the results of for myself.  Choosing a few parts at random from files I had on hand, it was just a matter of opening each file and saving it with SOLIDWORKS 2015.

[Image: Compare Audi 2013-15]
The Audi model had a file size of 54,060,544 bytes in SolidWorks 2013, which was reduced to 45,854,205 bytes in SOLIDWORKS 2015.  A reduction of 15.2%.

[Image: Compare Deck 2011-15]
The deck model had a file size of 1,110,016 bytes in SolidWorks 2011, which was reduced to 801,139 bytes in SOLIDWORKS 2015.  A reduction of 27.8%.

[Image: Compare Fillet]
The fillet example model had a file size of 626,688 bytes in SolidWorks 2014, which was reduced to 353,698 bytes in SOLIDWORKS 2015.  A reduction of 43.6%.

[Image: AL-KO 2014]
The AL-KO model had a file size of 21,794,816 bytes in SolidWorks 2014, which was reduced to 8,710,887 bytes in SOLIDWORKS 2015.  A reduction of 60%.
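
For anyone who wants to check my arithmetic, the percentages are simply (old − new) / old. Here is a minimal Python sketch using the byte counts quoted above (the labels are just my own shorthand, not anything from the files themselves):

```python
# Percentage reduction in file size: (old - new) / old * 100
# Byte counts are the ones quoted above; the labels are my own shorthand.
sizes = {
    "Audi (2013 -> 2015)":   (54_060_544, 45_854_205),
    "Deck (2011 -> 2015)":   (1_110_016, 801_139),
    "Fillet (2014 -> 2015)": (626_688, 353_698),
    "AL-KO (2014 -> 2015)":  (21_794_816, 8_710_887),
}

for name, (old, new) in sizes.items():
    reduction = (old - new) / old * 100
    print(f"{name}: {old:,} -> {new:,} bytes, {reduction:.1f}% smaller")
```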

These are some impressive savings, and I'm a little surprised that SOLIDWORKS hasn't made more of this achievement.  I was curious as to how this was achieved and thought it needed some more investigation.  It was way above my ability to work out, so I thought there was only one person to speak to about it.

So I approached Vajrang Parvate – Director of Product Development, Dassault Systemes SolidWorks – and posed a few questions to him.

(ML)  Question 1.  Why?  Or, more precisely, why now?  The product is doing the job, storage/hardware is relatively inexpensive, so why spend the time now?

[VP] The why: users are creating bigger and bigger assemblies and drawings as time progresses. With SOLIDWORKS 2015, we are now completely leaving the 32-bit architecture behind, and it sets the stage for the entire SOLIDWORKS user base to make the leap into ever larger and more complex designs.

The second part of this is that, yes, storage is relatively cheap – but only for individual users with consumer-level disks. Once you step into the world of data archival, enterprise storage, storage replication (such as Enterprise PDM vaults), network transfers, opening/saving files from network shares, backups, etc., every little bit helps where file sizes are concerned.

And why now? It's just when everything came together in the product (see the answer to Question 1-2a for how project lifecycles work out).

(ML) Question 2.  How?  How do you achieve such reductions?  Is it just the raw data or does it include the likes of metadata as well?

[VP] Most of the how is a trade secret of SOLIDWORKS. Suffice to say we have been able to accomplish this without compromising any of the functionality of SOLIDWORKS and related products – except for Previous Release Interoperability and one other minor custom-property-related feature in Windows Explorer.

(ML) Question 1-2a.  Another How or Why question.  You may call it the “chicken or the egg” question.  Coming from a manufacturing development background, it always interests me how any “new feature” is introduced.  Was there a technological breakthrough “solution” sitting there waiting for a “problem” to solve?  Or was there just the question “Can we do that?”, and hard work solved the problem and gave you the answer?

[VP] Allow me to draw parallels here between software engineering and other kinds of engineering; they are actually very similar.

The evolution of a product leads it to be used and stressed in unexpected ways beyond its design limits, which exposes weaknesses that weren’t seen before; this leads the product designers to beef up that part of the product, which then leads it to be pushed by the users in other ways; rinse and repeat.

Users have been making ever more complex and larger assemblies in SOLIDWORKS ever since we introduced the 64-bit version in SW2006. In parallel, Enterprise PDM as a file vault and archival system has been gaining popularity by leaps and bounds. Customers create a lot of their intellectual property in SOLIDWORKS, and we have to make sure all workflows that aid in keeping that data safe – backups, network transfers, storage replication, etc. – are performing to the best of their abilities.

We knew a day would come when file size and how we store things on disk would become something that is on the “front burner”.  Just like the design process for any product, we started the investigation several years ago – with proofs of concept, prototypes, assessment of impact on all other products / APIs in stages, etc.

On a similar note, we are currently working on things that users will not see for 2 or 3 years, perhaps never. Yes, just like product engineering, not all of our prototypes or proofs of concept pan out.

(ML) Question 3.  I read on the Beta forum about “totally rewriting the code”.  Most of us whose only understanding of code comes from the movies picture the coder fuelled by alcohol and various caffeinated Red Bull drinks, feverishly typing away. In reality, was this the case?  Did you have someone isolated, feverishly typing away 22 hours a day?  Or, as I should phrase it: how many people were involved, and how many hours, in working on this part of the software?

[VP] I find it very amusing as well when I watch a movie showing the code-junkie stereotype. All projects, especially large ones, have some element of research, planning, prototyping, definition, coding, QA and system automation before they become visible to users. Yes, there were certainly times in the release cycle when the developers and QA engineers were working really long hours and weekends to meet the code freeze deadlines, to make sure we could ship Alpha, Beta 1, Beta 2, etc. on time. Alcohol was definitely not involved, though caffeine most likely was. I couldn’t say how many people or hours were involved in the project, but certainly 2-3 or more engineers from every product that we ship were involved in some way at different points in the development cycle.

(ML) Question 4. We know that at this stage there is no backward compatibility between 2015 and 2014 SP5 (not that I personally believe it is going to affect too many people!).  I guess the question is: is this leading further away from seeing greater backwards compatibility?  That is, across more versions, and with full features, not just imported models?

[VP] The fundamental nature of data saved to disk is that it is a “serialization” of the in-memory objects of that version of the code. A piece of code must be matched to the objects’ version on disk or else bad things will happen. With the large number of features that are available in SOLIDWORKS (and in this context, even things like annotations in Drawings count as features), it becomes virtually impossible to provide feature-level backward compatibility.

This has been the nature of the problem since SOLIDWORKS was released in 1995. As engineering trade-offs go, I believe most users would rather we focus on more productivity features, performance and stability in the product…
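
(An aside from me, not from Vajrang: to picture what “serialization of the in-memory objects” means, here is a deliberately toy Python sketch. It has nothing to do with how SOLIDWORKS actually stores data; the format, field names and version numbers are all made up, purely to show why code built for an older version cannot safely interpret data written by a newer one.)

```python
import json

# Hypothetical "version 2" writer: it serializes an extra field
# ("thread_callout") that the version 1 code never knew about.
def save_v2(note, path):
    data = {"format_version": 2,
            "text": note["text"],
            "thread_callout": note["thread_callout"]}
    with open(path, "w") as f:
        json.dump(data, f)

# Hypothetical "version 1" reader: it only understands version 1 data.
def open_v1(path):
    with open(path) as f:
        data = json.load(f)
    if data["format_version"] > 1:
        # The older code has no idea what the extra fields mean,
        # so the only safe thing it can do is refuse the file.
        raise ValueError("File was saved in a newer version")
    return {"text": data["text"]}

save_v2({"text": "M8 hole", "thread_callout": "M8x1.25"}, "note.json")
open_v1("note.json")   # raises: File was saved in a newer version
```

Multiply that little headache by every feature type in the product (and remember, even drawing annotations count as features here) across every release, and you can see why feature-level backward compatibility is such a hard trade-off.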

(ML) Question 5.  I guess if we refer back to the “Why” of Question 1, is this leading to further future development?  (Now switching to hard-hitting journalist mode!)  Is this leading to greater compatibility with, say, other Dassault Systemes SOLIDWORKS products such as… SolidWorks Mechanical / Industrial Conceptual?  Or is it just about performance and file size?

[VP] I will invoke my “no comment” card on this one.

(ML) Question 6.  Is “file size” reduction about “Performance”?  Will it take less time to “Save”, “Rebuild” and “Open” files?

[VP] It started off as purely being about file size. But during the course of the project we were able to optimize portions of the adjacent code, which made it more efficient to read/write in specific cases, e.g. files with an extremely large number of configurations. Rebuilding files is a purely CPU-bound operation and is unaffected by this project.

Well, that was interesting and somewhat educational.  I’m not sure what you can read into Vajrang’s “no comment”.  A red herring perhaps, or just something to keep a few of us speculating.  Either way, watch this space!  (Well, not particularly this space!)

If you were looking for just one reason to update to SOLIDWORKS 2015, this, I believe, would have to be close to it!

I have to thank Vajrang for his time and assistance in providing these answers.  I approached Vajrang around the time of Beta 3, which also coincided with him travelling for training with the reseller.  I’m sure he had more than enough things going on at the time.  I greatly appreciate the time he spent answering these questions.

7 responses to “SOLIDWORKS 2015 – The How & Why of Reduced File Sizes #solidworks #sw2015”

  1. THANKS Mick and Vaj for a most schmick, interesting, educational and amusing interview!

  2. Very informative. Thanks to both Michael and Vajrang for taking the time.

  3. Is there any way to reduce the file size in 2013 itself? My client is using SW2013, so I can’t update to 2015!

    1. Unfortunately no! The file size reduction is due to newly written code introduced in 2015.
      If you are looking to reduce file size so you can transfer files (email), you might try saving in a different format. I find Parasolid (x_t) normally gives the smallest file size.

  4. […] deal of assistance to a few of my posts over this past year.  My discussion with him for the  SOLIDWORKS 2015 – The How & Why of Reduced File Sizes post is one of my most viewed […]

  5. OK, so we know it’s a compression algorithm included in the open/save routines. Why can’t they say so, and that its purpose is to facilitate the faster transfer of files to cloud applications, storage and sim farms?

  6. […] the Helix and Swept Cut features. However, for most purposes, this is an unnecessary step that adds to the file size and complexity of the part. In most cases, it is sufficient to simply represent the feature using a […]
