In a sense, script memory limitations aren't coming to Second Life; they already exist.
What's happening now is that Linden Lab is making those limits predictable, and arranging things so that script memory usage doesn't cause simulator processes to thrash madly (paging memory to and from disk).
There are some interesting side-effects emerging from the overall prototype implementation, however. Mono (and, eventually, C#, if and when it is implemented as a scripting language) looks like the loser.
There's been a lot of public discussion between Linden Lab developers and open source developers and users in the last couple of weeks, and we've more or less boiled down the discussion for you.
Your basic in-world script can be compiled for either the old (deprecated) LSL runtime engine, or the newer shinier Mono runtime engine.
The script will be allocated a certain amount of memory, which includes the compiled code. Given two identical scripts, the one compiled for Mono occupies more memory for its code than the one compiled for LSL.
Scripts compiled for the LSL runtime are allocated a fixed block of 16K of memory, while scripts compiled for Mono are given four times that: 64K of memory, which allows much more data to be stored even accounting for the increased overhead of Mono scripts. Scripts crash if they ever exceed their preallocated storage.
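Under Mono, a script can already inspect how much of its preallocated block remains via the existing LSL function llGetFreeMemory; a minimal sketch:

```lsl
// Sketch: report remaining script memory when the object is touched.
// Under Mono, llGetFreeMemory() returns the bytes not yet used out of
// the script's 64K allocation; exceeding the allocation crashes the script.
default
{
    touch_start(integer total_number)
    {
        llOwnerSay("Free memory: " + (string)llGetFreeMemory() + " bytes");
    }
}
```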
Due to various tricky implementation constraints, there are no plans to allocate memory to scripts dynamically on the fly (which would limit each script to only the amount it actually uses), so a script's memory usage is whatever amount is preallocated at startup.
That preallocated amount will be adjustable (though still capped at 16K for LSL or 64K for Mono) via a new function: the script author can determine how much memory the script will actually use, and set the preallocation to just that. As usual, the script will crash if the preallocated amount is exceeded.
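The article doesn't name the new function. Assuming it resembles LSL's llSetMemoryLimit (the call Linden Lab ships for exactly this purpose), lowering a Mono script's preallocation might look like this (the 20K figure is an arbitrary example):

```lsl
// Sketch, assuming the new call resembles LSL's llSetMemoryLimit:
// cap this Mono script at 20K instead of the default 64K preallocation.
default
{
    state_entry()
    {
        // Returns TRUE on success. The usual rule still applies:
        // the script crashes if it later tries to exceed the 20480-byte cap.
        llSetMemoryLimit(20480);
    }
}
```

The trade-off is the one the article describes: set the cap too low and the script crashes; leave it at the default and the full 64K counts against the region's rationed memory.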
Now, this particular implementation of rationing script memory has a few interesting corollaries:
- Given identical scripts, the one compiled for the LSL runtime is always the better option for resource usage, unless (for some reason) you can't fit all your data into its 16K. The LSL runtime produces the smallest compiled bytecode for a given piece of source code.
- Even when two scripts do the same job and store the same data, the Mono version will have to reserve more memory to accommodate the additional overhead of its bytecode and stack. Mono loses.
- If or when C# (which also compiles down to Mono) becomes available as a scripting language, it will likely consume more memory than an LSL or LSL-on-Mono script compiled to do the same job. C# loses even more, if only because of the translation of event and data models.
- Lastly, once script memory becomes a rationed resource and displays of its usage become available to the general Second Life population, all of this will actually make a difference. Scripted objects that use the LSL runtime will intrinsically appear more attractive to purchasers, users and landowners than those that compile down to Mono code.
We contacted Linden Lab last week to make sure we had this all straight, but the Lab declined to provide us with any clarifications or additional information – though we're expecting a blog-post on the topic from the Lab later in the week.
Are you a part of the most widely-known collaborative virtual environment, or keeping a close eye on it? Massively's Second Life coverage keeps you in the loop.