Topic: Linker Problems in Debug Mode
Arty Topic Opener |
Posted at: 2018-11-05, 22:38
I am working on Windows 7 (64-bit) and use msys64 to build Widelands. This works fine for release builds, but I have trouble finishing any debug build. Everything works fine except linking most of the executables. If I use cmake and ninja as usual, I get three ld.exe processes, all accumulating memory. A minute or so later each of them hogs 2 - 2.5 GB (depending on what else I have running), then Windows starts complaining that the processes use too much memory (I "only" have 8 GB total), then the processes start to use a little bit less (does Windows force them?), and a couple of seconds later the linking fails:
First question: Why are the three linking steps even done at the same time? And can we prevent that via the cmake files? Linking them manually one by one seems to work. Or at least linking widelands.exe worked; I didn't try the others. Then there was only one ld.exe process, which stayed relatively stable at about 2.5 GB memory usage, and after a couple of minutes the linking finished successfully. Is it normal that the linker requires that much memory for Widelands? Is it normal that the debug version of widelands.exe is 680 MB in size? Maybe this is all normal and I am just not used to big projects, but this seems a bit excessive to me. And if this is not normal: has anyone else had these issues? Any ideas how to overcome them?
stonerl |
Posted at: 2018-11-05, 22:53
It seems you're simply running out of memory. Three jobs are executed because you have a quad-core CPU. It's 3 jobs because of this addition in line 202 of the compile.sh script.
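(For reference, a minimal sketch of that kind of job-count logic, i.e. deriving the parallelism from the core count; the actual code in compile.sh may differ:)

```shell
# Hypothetical sketch: choose the number of parallel build jobs as
# "cores minus one", which yields 3 jobs on a quad-core CPU.
CORES="$(nproc 2>/dev/null || echo 4)"
JOBS=$(( CORES > 1 ? CORES - 1 : 1 ))
echo "Using ${JOBS} parallel jobs"
```

With a scheme like this, every compile and link step competes for the same job slots, which is why several ld.exe processes can run at once.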
From what I can tell, this size (680 MB) seems right for the debug version.
einstein13 |
Posted at: 2018-11-05, 22:54
I remember that before the AppVeyor age, Tino provided debug builds on demand, and the exe file was about 130 MB. So your 680 MB is a bit too much. EDIT: maybe something changed in the last 2 years? I may be wrong here. Do you need Windows builds? If you are changing the Widelands code yourself, that is probably the only way, but if you just want an up-to-date version, you can try AppVeyor. And if you have to build something (e.g. a branch), you can also try building under Linux. If I have to build, I use a virtual machine (VMware) with Ubuntu, where compiling Widelands takes some time but produces no errors, and even with my poor knowledge of C++, the Widelands code and Linux commands it is still manageable. Edited: 2018-11-05, 22:55
stonerl |
Posted at: 2018-11-05, 22:56
Tino's builds did not contain any music files.
Arty Topic Opener |
Posted at: 2018-11-05, 23:10
Thanks for the quick answers. Yes, I was actually aware that it tried three processes at once due to the multiple cores; I was basically just wondering why this is the default if linking is such a memory hog. (I mean, I am aware that my 8 GB of memory isn't exactly a lot nowadays, I had just assumed it should be more than sufficient for linking.) But it's good to know where I can change this in the compile script. Thanks. I have started contributing to Widelands as a developer, so I am building on my own machine. I usually don't even make debug builds if I can avoid it, but occasionally it might be necessary, so it's good to know how to do this easily despite having lowish memory available. As for Linux... I must admit that I haven't used Linux in 15+ years and have become somewhat complacent with Windows. Maybe I'll set up Linux again.
Tino |
Posted at: 2018-11-06, 07:37
He is talking about the exe file size, not the installer file size. Arty: You are completely fine, I think my debug exes were 600-800 MB. I just stripped them before including them. Yes, I stripped both debug and release builds... And I also think that our build/link process is currently a total mess due to circular(?) dependencies. I ran into the same problems on AppVeyor (only 3.75 GB RAM), so there I reduced the memory usage by (see appveyor.yml):
So this does not build the website tools (one linking job less), turns off the ASAN check, defines a job pool of size 1 and uses it only for linking. So compiling still uses every available core, but linking is not parallelized. I am using ninja, but cmake should map this fine to options for make, too. You can also disable building the tests, which cuts the build time in half. Our debug builds take ~1 hour on AppVeyor, release builds ~30 minutes. The additional 30 minutes are purely linking time for all the executables... Edited: 2018-11-06, 07:41
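For anyone who wants to try this locally, here is a sketch of a cmake invocation along those lines. The OPTION_BUILD_WEBSITE_TOOLS and OPTION_ASAN names are my reading of the Widelands build options and may not match exactly (check CMakeLists.txt); the job-pool variables are standard CMake and are honoured by the Ninja generator:

```shell
# Compile in parallel, but put all link steps into a pool of size 1
# so only one ld process (and its multi-GB working set) runs at a time.
cmake .. -G Ninja \
    -DCMAKE_BUILD_TYPE=Debug \
    -DOPTION_BUILD_WEBSITE_TOOLS=OFF \
    -DOPTION_ASAN=OFF \
    -DCMAKE_JOB_POOLS="linking=1" \
    -DCMAKE_JOB_POOL_LINK=linking
ninja
```

CMAKE_JOB_POOL_LINK assigns every link step to the named pool, so compile jobs still saturate all cores while link jobs queue up one after another.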
GunChleoc |
Posted at: 2018-11-06, 19:51
I have been trying to attack those circular dependencies, but it's a huge and convoluted task. So far I have only made a small dent in it, and it will take a couple of years at the rate I'm going. For some reason, debug builds under Windows are about twice the size of debug builds under Linux. Busy indexing nil values
stdh |
Posted at: 2018-11-19, 21:07
Those CMake options Tino gave are quite interesting; with them I can compile Widelands without swapping. Maybe those of us with a modest amount of RAM would appreciate it if they were set as defaults? I don't know whether parallel linking gains much time on a non-modest computer - can someone confirm or deny this?
GunChleoc |
Posted at: 2018-11-20, 09:04
We have added a "-j" option to the compile script, so you can tell the compiler to use only 1 thread there. You could also compile with the default setting, abort it when it starts linking the big libraries, and call it again with -j 1.
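Putting that together, the two-step workflow would look roughly like this (assuming compile.sh takes the thread count as an argument to -j, as described above):

```shell
# Step 1: build with the default (parallel) settings; interrupt with
# Ctrl+C once the big link steps begin and memory usage climbs.
./compile.sh

# Step 2: resume with a single thread so only one ld process runs.
# Already-compiled object files are reused, so mostly only the
# remaining link steps are executed.
./compile.sh -j 1
```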