Sunday, February 21, 2010

And The Answer Is...

42, of course!

When I started this blog, I promised insight into Krakatoa, MAXScript, Life, The Universe and Everything.
Well, there is no better time for that than this year. 4 days ago I turned 42, and for anyone who holds The Hitchhiker's Guide to the Galaxy dear to their heart, this age sounds quite cool.

Except it probably isn't.

You see, the other day my parents told me they read or saw an interview with James Cameron and he mentioned he was still feeling like an 8-year-old. Oh my, I told them, so I am older than The King Of Pandora!? I have always felt like 16, except when I was actually 16, when I felt like 20. Crazy eh?

Anyway, I don't think I will have more answers about Life, the Universe etc., but I can surely talk about the few things that I know about. For example we are beta-testing Krakatoa v1.5.2, so I might mention some things about it.

Probably the coolest thing that happened to this version (quite by accident) was the speed-up it provides to PRT Volume particle generation. We were working on something completely different (which won't be in 1.5.2, but which I guess you will all get sooner rather than later) and discovered that our memory management wasn't working as expected. Normally, when loading particles from most sources like PFlow, TP, PRT sequences, Mesh Vertices etc., Krakatoa not only gets the particles, it gets the actual particle count long before it has loaded a single particle. All these sources are able to tell Krakatoa "I am going to give you N particles", so Krakatoa can collect these numbers and reserve memory for all of them before it starts loading their data. (This is the case in the 64 bit build; in 32 bit, half of the time it might not be able to allocate enough memory to fit them all... Switch to 64 bit if you haven't yet!)

There are a couple of exceptions to this. CSV files do not provide even a hint about the incoming particle count - the only way to find out would be to actually read them all, and since reading text files is slow enough as it is, we just do it once and allocate memory as we go. Currently, the same applies to PRT Volumes. The PRT Volume could theoretically estimate the final count based on the mesh, voxel size etc., but that is a bit complicated with all the Shell options and the testing against the mesh volume, so right now it does not report any expected count to Krakatoa.

The result of loading PRT Volumes in v1.5.0 and 1.5.1 was a slowdown that grew dramatically worse as the particle count went up. We realized that this was caused by many (and slow) memory allocations and decided to let the user reserve enough memory for a given amount of particles via some manual controls in those cases where Krakatoa does not know the final count. The result was quite mind-blowing. Creating 50 million particles with a PRT Volume went from 11+ minutes down to 17.5 seconds, which is about 38 times faster. When I reported this to our boss, he asked jokingly "Why not 40x?". So I tested 100 million particles and that was 92 times faster, so I guess everybody should be happy now. After closer inspection we realized that the memory allocation logic in the previous versions had some flaws, so we are working on fixing them in the hope of getting the same speedup without any manual memory allocation controls.
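The difference between growing a buffer incrementally and reserving it up front is easy to feel in MAXScript itself. The snippet below is just an analogy to what the renderer does internally (it uses a plain MAXScript array, not any Krakatoa code), but the timing gap it demonstrates is the same phenomenon:

```maxscript
(
    local n = 1000000

    -- Variant 1: grow the array one element at a time.
    -- Each append may trigger a reallocation behind the scenes.
    local t0 = timestamp()
    local a = #()
    for i = 1 to n do append a i
    format "append one-by-one: % ms\n" (timestamp() - t0)

    -- Variant 2: reserve all slots up front by pre-setting the count,
    -- then fill them in place -- no reallocations while filling.
    t0 = timestamp()
    local b = #()
    b.count = n
    for i = 1 to n do b[i] = i
    format "pre-sized: % ms\n" (timestamp() - t0)
)
```

The pre-sized variant should come out well ahead, for the same reason the PRT Volume got its speed boost once it stopped allocating as it went.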



Another quite positive development that has been on my Wishlist for a while is the addition of a Particle Input Stream. This allows anyone with basic MAXScript knowledge to open the render-time particle stream of a PRT Loader or PRT Volume and read particle data from it. For those of us with advanced MAXScript knowledge, it meant the introduction of the Krakatoa Particle Data Viewer utility which can be used to peek into the particle data of a scene object. It lets you select a range of data to display, filter that data by any combination of search criteria, and even filter by the value of the Selection channel - which could be set by a KCM, thus allowing for filtering by arbitrary MagmaFlow logic! Finally, selecting one or more rows in the editor will highlight the corresponding particles in the viewport!
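Just to give a flavor of the idea - and this is NOT actual shipping code, the interface and method names below are placeholders made up for illustration, so check the 1.5.2 documentation for the real ones - reading such a stream from MAXScript would look something like this:

```maxscript
-- HYPOTHETICAL sketch -- the interface and method names below are
-- placeholders; consult the Krakatoa 1.5.2 documentation for the real ones.
(
    local stream = KrakatoaParticleStream.OpenStream $PRT_Loader01
    while not stream.AtEnd() do
    (
        local p = stream.ReadParticle()
        format "Position: % Density: %\n" p.Position p.Density
    )
    stream.CloseStream()
)
```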

In MagmaFlow, we added Quaternion operators, the missing and quite important Natural Logarithm operator and fixed some bugs in the handling of nested BlackOps. Flows will now save a material library with any maps used by TextureMap Input nodes and will load them back when loading the flow. TextureMap Inputs can now show a preview of the selected map inside the MagmaFlow command panel. Connections to scene objects will be restored when loading the flow if a scene object with the same name exists. Object references using the $ path syntax inside Script Input nodes are now supported for interactive updates. We have some more quite sexy operators in the works, but they might have to wait for the next major release, so I will stop here.
Related to KCMs, we are introducing two new MacroScripts - one for saving a modifier stack to disk, one for loading it from disk. This lets you create a whole bunch of related (or not so related) KCMs on an object, save them to disk together with any other modifiers found on the stack like deformers, Krakatoa Skin Wrap etc, then select one or more objects in the same or in a completely different scene and apply the same modifiers to them.

For the few people using the Presets and History dialog (according to our surveys, most users are confused by it, but I am unsure how to improve it, so I am open to suggestions!), we added two vertical icon strips for faster browsing of records by image. You can now just scroll the image strips and load presets or browse thumbnails, saved preview images or settings much more intuitively.

If you read my blogs, you probably remember my rant about how people don't use the MacroScripts with those colorful icons we provided. Well, we not only replaced the icons with slightly better ones, we also added a Krakatoa menu to the Main Menu bar of 3ds Max (which can be turned off in the Preferences) so you can now access all tools and settings from there without having to customize a toolbar.

Initially, we wanted v1.5.2 to be a pure bug fix release, so we squashed a whole lot of bugs - you can read their obituaries here. At this point, we are in the same situation as with 1.5.0 which could have been called 2.0.0 without any problem - 1.5.2 could easily be called 1.6.0 since it brings quite a few new features too.

As mentioned, Krakatoa 1.5.2 is currently in Beta. If you have a commercial license and you want to test it but have no access to the Beta forum, let us know. If you don't have a license, you will have to wait a few more weeks...

Sunday, January 3, 2010

A Krakatoa Year In Retrospection

Here we are, finally in 2010 - we have had 3ds Max 2010 for almost a year now, so I am quite used to the number already. 2009 was a Good Year for Krakatoa customers (at least I hope so) due to the release of v1.5 and all that came with it.
Our company worked under the name Frantic Films VFX on a movie hopefully nobody saw, on another one under the name Prime Focus VFX that many people saw, on a third one that nearly every teenage girl saw and on the movie everybody seems to have seen at least three times since it just made a billion today. And all four of them used Krakatoa in their production. There was a fifth movie that had tons of Krakatoa in it, but it is still not released so I cannot talk about it, and after Avatar it is difficult to be excited about other movies anyway :)

So the beginning of the year found us developing Dragonball: Evolution and G.I.Joe at the same time - the former handled mostly in Winnipeg with animation done in Vancouver, the latter done mostly in L.A. with animation coming from Vancouver and some effects done in Winnipeg. From an RnD point of view, the two movies had some parallels - in DB:E we had to grow objects using particles, in G.I.Joe we had to destroy them. While the actual implementation of the effects in production ended up quite different, the research process showed that the two tasks can be seen as complementary.
For example, some early tests called for the building of an exo-skeleton from particles drawn from the environment (dust, rocks etc.). Now if you try to use PFlow's Find Target to land particles on a moving surface, it is not impossible, but quite hard to control correctly. Whereas Thinking Particles takes an approach where the user's control over a moving particle is gradually stripped away, forcing the particle to an exact location, PFlow just uses forces to try to get there even as the target point constantly escapes. If you look at the history of special effects in movies, shooting the action in reverse and playing it backwards has been one of the earliest and most amazing tricks of cinema. With the ability of the Krakatoa PRT Loader to easily control the flow of time using the Playback Graph parameter, it is quite easy to simulate the EMISSION of particles from a surface together with complex forces, then play the resulting PRT sequence backwards to produce the build-up of particles on the surface. Try it out some time!

The production of the Nanomite effects for G.I.Joe involved two companies (Prime Focus VFX and Digital Domain) using vastly different rendering approaches to create the same look. In the end, particle (point) rendering and voxel rendering looked quite similar - at that point in time, production was locked to a build of Krakatoa that did not have Voxel Rendering yet, and I guess we wouldn't have used it even if it worked already. Assuming that a cloud of metal-eating miniature robots would produce some reflections, we added Environment Reflection Mapping support to Krakatoa, but ended up not using it for the movie. So the end users of the 1.5 commercial release benefited from this development. For a while, there was also support for Anisotropic Specular Highlights in the beta builds, but it just wasn't done right and we decided to pull it out of the shipping product. It might return someday in a better shape. If you haven't seen the "Particles In The Zoo" video by Matthias Müller, you should go watch it now - he used the Environment Reflections ability of Krakatoa to great effect, especially on the "scales" objects.

Then we had the real blockbusters - Twilight:New Moon and of course AVATAR. (I think Blogger.com should add support for the Papyrus font so we can write the name as JC intended) ;)

For the New Moon movie, Prime Focus employed Krakatoa for the "apparition effect", as well as for the foam on the wave that hits Bella. Both effects can be seen in the official trailer (at 1:10 and 1:03 respectively). Not being a teenage girl, I haven't seen the movie yet, but will probably rent it on Blu-Ray when it comes out. In the look development / RnD phase, I also tried to use Krakatoa's new Voxel Rendering for the "diamond skin" effect, but we ended up using an alternative approach based on V-Ray. The Krakatoa effect looked quite promising though and I am sure the knowledge gathered from those tests will end up somewhere else.

Strangely enough, my involvement with Avatar was mostly unrelated to Krakatoa - like with G.I.Joe, I worked on pipeline tools to speed up the production workflow. For G.I.Joe, we used a prototype of an assembly system where all assets were separated and combined only at render time, without using an actual MAX file to hold the scene. The scene was assembled on the fly for editing or rendering, then the changes were saved back to new versions of the asset files in their original locations on the network, and any change to an asset would propagate automatically throughout the sequence. No XRefs involved!
In the case of Avatar, James Cameron insisted on absolute continuity of the animation sequences shown on the 3D screens in the Bio Lab and the Ops center. So we ended up developing a database application (called SAGI = Screen Art Graphic Interface) that would keep track of shot lengths and what is seen on which screen. The 2D artists working on the screen graphics could use a User Interface to this database to enter their latest versions and request a 3D rendering of all screens affected by their entry. The SAGI application would write a control file and "drop" it into a folder monitored by the other part of the system called ASAR (short for Automatic Screen Art Rendering). ASAR was written in MAXScript and ran on a Deadline Slave as a never-ending MAXScript job, periodically checking the drop folder for SAGI request files. When it found a file, it would process it by loading the 3D assets, applying the right timing of the right textures for the Left and Right eye and submitting all necessary passes as new Deadline jobs. This made human error in the assembly and rendering phase impossible and allowed 2D artists to trigger 3D rendering jobs without any knowledge of 3ds Max or Deadline. But most importantly, it allowed us to change the length of shots and preserve the visual continuity of the screen contents between shots without much human intervention. (You can read the official Press Release here).
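The drop-folder pattern at the heart of ASAR is simple enough to sketch in a few lines of MAXScript. This is not the actual production code, of course - the folder path, the file pattern and the processing function are made up for illustration:

```maxscript
-- A minimal drop-folder watcher in the spirit of ASAR. The folder path,
-- file pattern and processing function are made up for illustration.
(
    local dropFolder = "C:\\drop\\"
    fn processRequestFile theFile =
    (
        -- Here the real system would load the 3D assets, apply the right
        -- textures and timing, and submit the render passes as new jobs.
        format "Processing request: %\n" theFile
        deleteFile theFile -- remove the request once it has been handled
    )
    -- Run forever, like a never-ending job on a render node:
    while true do
    (
        for f in getFiles (dropFolder + "*.txt") do processRequestFile f
        sleep 10.0 -- poll every 10 seconds
    )
)
```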
That does not mean that Krakatoa wasn't involved in the making of the movie. It helped give the "holotable" terrain its distinctive LIDAR look and was seen on the "false color images" screens showing the energy flow at the Tree Of Souls. (seen below before and after)

Apropos LIDAR - about a year before I joined the company, some shots for X-Men 2 also involved a "holotable" graphics display. It is amazing how these kinds of jobs come around again and again (along with crystal growth and particle disintegration).


Another curious fact about X-Men 2 and Avatar - the former happens to be the movie with the highest Tomatometer rating Frantic/Prime Focus has ever worked on. The latter appears to be the one with the highest financial success. Reading Box Office Mojo daily makes me feel great, especially after all the online negativity before its release.

Outside of the world of VFX, the past year was marked by some of the best concerts in my life, led by the Winnipeg performance of Leonard Cohen, followed not-so-closely by AC/DC and KISS and farther behind by Metallica. (the above sentence should give you an idea of the approximate range of my musical taste). The year also included the worst concert I have been to, unfortunately by an artist I generally love, thus the disappointment was even bigger, but I won't discuss this further. It also saw my AI favorite losing the title despite performances like this, this and this (yes, I am a Glambert!) but producing an amazing album and getting to do this.

Just before the end of the year, my wife and I got to see Chaplin's movie "City Lights" with music performed live by the Winnipeg Symphony Orchestra. Despite being in black and white and having an almost square aspect ratio, it was possibly the second best movie experience I had this year, right behind Avatar (which was in color, wide screen and 3D, of course)...

So on to a new year with lots of changes for me on the horizon and hopefully with more amazing music and a 2.0 version of Krakatoa on the market!

Happy New Year everyone!

Saturday, November 21, 2009

"Grainy, like a Krakatoa render..."

As you could imagine, I spend a little time googling "krakatoa particles" and "krakatoa render" about once a week to find out what people are doing with it and what they are talking about on various forums.
A couple of weeks ago I discovered the following (rather old) thread on the SideEffects forums related to a Houdini rendering which was described, as the title of this blog says, as "pretty grainy, like a Krakatoa render".
I guess this is a good cause for a new blog.

I must say the actual animation shown in the thread wasn't that grainy, but I am more concerned about the public image of Krakatoa. I suspect the assumption that Krakatoa has a particular look is caused by the huge amount of animations on YouTube that simply have the wrong settings.

Of course, in some cases Krakatoa is being used to produce sand or pieces of solid objects flying around. In such cases making each particle distinguishable as a dot can be desirable. In fact, the Nanomites in G.I.Joe were also rather grainy, but it was the look that was requested.

When using Krakatoa to create effects like Ink in Water or Wispy Smoke though, the rule of thumb is: if you can see a particle as a particle, your Density is too high! The main idea behind Krakatoa (even before it was called Krakatoa) was to take several hundred million particles and draw them together into the image buffer with very low density per particle to accumulate a SMOOTH final result - in the movie "Stay", where our particle rendering was first used (inspired by Doc Baily's Spore rendering), the effect looked like glowing plasma. In the movie Cursed, we actually used FLOOD to drive millions of particles through a simulation and rendered them in the same renderer to get the Wispy Smoke seen in this PDF.

The recent "Ink" animation created by weareflink for the CCTV shows a pretty good use of Krakatoa simulating ink in water without being able to distinguish single particles. Also, the Vilnius SPA animation by DekoLT is a great example of high density but enough particles to create solid-looking clouds.

Another factor that can cause a lot of grain in the rendering, especially of solid-looking clouds, is the Light Pass Density. As you probably know, Krakatoa lets you decouple the density of particles as seen by the lights from the density used by the camera to draw the particles. High density in the Lighting Pass combined with very high particle counts can produce self-shadowing right at the surface of the cloud - the outermost layer of particles "eats up" all the light, and the very next particle layer below it appears very dark as opposed to the brightly lit one above. This can produce not only grain but even very undesirable moiré effects.

Reducing the Light Density a bit and letting more light penetrate the volume (also resulting in a sweet sub-surface scattering effect) usually solves this problem.


So please, if you are rendering in Krakatoa, make sure you crank up the particle count AND lower the Density until you get a smooth result, then play with the balance of the Lighting Pass vs. Final Pass Density to get the correct amount of light penetration and pixel coverage...


EDIT: You can find some illustrations on this new documentation page.

Have a smooth rendering! ;)

Sunday, October 25, 2009

Krakatoa 1.5 - Confusing Changes For The Better

Once again, while most of this is already covered in the online documentation, I feel that spelling it out for the few people reading my blog might be a Very Good Idea. I will probably have to update the FAQ or just link to this Blog or something like that.

First, the default lights handling.
When designing Krakatoa 1.0.0, we discovered that particle rendering with default scene lights never looked good. This was mainly because the default mode for default lights in Max is a "headlight" right behind the camera, which does not produce very good-looking shadows. The alternative mode is two lights, which works even worse with volumetric rendering and requires one more light sorting / attenuation map generation pass...
So we made the decision back then to render particles as self-illuminated if no actual light node was detected in the scene. As a result, one could just create some particles, hit Render and get an idea where the particles were. On top of that, it worked great with Additive Mode where lighting was usually not desired (although Additive Mode + Lighting was somewhat supported).

When the Krakatoa version which ended up being released as 1.5.0 (it was initially developed as 1.2.0 and nearly shipped as 2.0.0 due to the amount of features added, but that's another story) added support for an Emission channel and an Absorption channel in addition to the Color (Scattering) channel, we had to revise this design decision. In short, having a per-particle Emission channel meant that it would be a Very Bad Idea to render particles as fully self-illuminated when no lights are found in the scene. At the same time, rendering Default Lighting was still as unusable as it was two years earlier - we looked into it again and finally decided to bite the bullet and render particles as not illuminated if there are no explicit lights in the scene.

What does this mean? In short, if you open a Krakatoa 1.1.x scene without scene lights in Krakatoa 1.5.x or just start a new Max scene, add some particles and hit render, you get... nothing. Or so it seems. Looking at the Alpha channel, the particles are there (and the log / progress dialog show that they are actually being loaded and processed). It is just that self-illumination is not applied implicitly, no scene lighting is applied either and you end up with black particles on black background (changing the background color to a brighter color shows that, too).

As you can imagine, this is one of the main problems new users of 1.5.x encounter and report on the forums and in support emails. The possible solutions are:
  • Check >Override Emission and >Use checkbuttons in the Global Render Values rollout (or alternatively check >Use Emission and >Override Emission in the Main Controls rollout). The default Emission Override color is set to white, so your particles will render as white by default.
  • You can also add a Map to the Color Override in the Global Render Values if you want more interesting results.
  • If you want to emulate somewhat the Krakatoa 1.1.x behavior where each particle renders the Color as Self-Illumination, you could also add a Global Channels Override KCM and set it to Color Input>Emission Output, then check ">Use Emission" without enabling the >Emission Override option - this will copy the Color of the particle into its Emission channel. PRT Loaders and PRT Volumes will render in their actual color.
  • Alternatively, adding a Vertex Color Map to the Emission Override slot will render the Color channel and put it into the Emission channel, but this approach is generally slower compared to using a Global KCM.
  • You could of course also create a Light in the scene to illuminate the particles, but this will cause longer render times due to the illumination pass when rendering in Particle Mode.
While I admit that this seems like a step backwards, the ability to specify Emission per particle means more flexibility (as we will see later). Flipping two checkboxes or creating a light shouldn't be a big price to pay...
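For those who prefer to flip these switches via MAXScript (all Krakatoa settings live in its string-based property store, as discussed in an earlier post), the idea looks roughly like this. The property names here are from memory and may not match your build exactly, so treat them as placeholders and verify them against the online documentation:

```maxscript
-- Property names are ASSUMED for illustration -- verify them against the
-- Krakatoa documentation for your exact build before relying on them.
FranticParticles.SetProperty "UseEmissionColor" "true"      -- >Use Emission
FranticParticles.SetProperty "OverrideEmissionColor" "true" -- >Override Emission
format "Emission override: %\n" (FranticParticles.GetProperty "OverrideEmissionColor")
```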

The second large change made to the main controls of Krakatoa was the replacement of the big >USE LIGHTING button with a much smaller >Ignore Scene Lights button. We wanted the lighting mode to be the default state of the UI and the underlying renderer, not a special mode one would have to activate. In light of (pun intended) the above discussion about emission and default / scene lights, this was the only logical way to go. On top of that, the >Ignore Scene Lights option does not switch the renderer into a different mode, it simply assumes any existing scene lights are actually turned off. (you could do that with the Light Lister, but it would be a PITA). At a certain point in the development, that option was nearly removed from the system, but it was deemed necessary to quickly produce Self-Illuminated particles using the Emission channel without taking scene lights into account.

And here we come to the third major change - Additive Rendering in Krakatoa v1.1.x used to be a distinct rendering state as opposed to Volumetric Rendering. But Krakatoa v1.5.0 implemented the full volumetric shading equation with Scattering (Color), Absorption and Emission terms, making it possible to control how much color is scattered into the eye, how much is absorbed from each color component as light passes through the particles and how much light is emitted by the particles. Additive rendering is simply fully Emissive rendering lacking Scattering and Absorption, which means that we can now define PER PARTICLE whether it should be rendered volumetrically or additively (or somewhere in between).
You can find more info here.

Knowing that our users will still want to be able to render particles additively without much clicking around the UI (Krakatoa has been doing additive rendering since long before it even had that name, around 2004), we decided to add a special option called >Force Additive Mode. What this option does is copy the Color channel into the Emission channel while setting both Color and Absorption to black. Thus, particles do not absorb light and do not scatter any light into the eye - they just emit light and accumulate into the pixels as desired. To illustrate what is going on, the >Use Emission and >Use Absorption options are grayed out in that mode. In this case, scene lights are also ignored completely even if >Ignore Scene Lights is not checked - particles with black Color and Absorption would produce no lighting effects at all, while wasting time sorting for attenuation map drawing that would not be used anyway...

As you can see, we wanted to make Krakatoa a lot more flexible by providing per-particle channels for the main shading values, but this required some changes to be made to both the default behavior of the renderer and to the controls in its User Interface.


Understanding the processes going on inside the renderer and the logic behind these changes should help you master the new version and lose your old habits built in the less advanced environment of Krakatoa 1.1.x.

I have the feeling I will continue this post in the near future with some more notes on feature and UI changes, so stay tuned!...

Friday, October 23, 2009

Krakatoa Icons Are There To Make Your Life Easier

Over two years ago Krakatoa 1.0.0 shipped to customers, packaged with some useful MacroScripts (even including pictures of volcanoes). To this day, I shake my head in disbelief when I see that nobody is using them.

As usual, some back story.

Krakatoa was implemented as a .DLR, a Render Plugin like most other 3rd party renderers. Normally, a Max-compliant renderer provides one or more tabs with one or more rollouts inside the Render Dialog. But designing User Interfaces using a resource editor in Visual Studio is a task nobody really enjoys. It is so last millennium, and a real PITA. Before Krakatoa, another renderer (or rather, bridge to a renderer) was under development by the same team - Amaretto. In both cases, the decision was made to split the UI and the core code and implement as many controls as possible using MAXScript. This had several positive implications:
  • The UI could be developed A LOT faster.
  • Fixes to UI bugs could be made without even restarting Max.
  • Fixes to UI bugs could be distributed to customers by simply replacing an .MS file without recompiling the DLR, often within the same HOUR the bug was reported. In fact, some fixes during Beta were even distributed by telling the user which line to edit or comment out (Do-It-Yourself fixing!)
  • Customers could replace portions of the code or modify the UI if desired, as well as read the UI scripts and learn how to access all internal properties of the renderer. All scripted components ship unprotected.
  • Last but not least, the UI design and implementation could be taken over by someone outside of the RnD team, someone who uses the software in production and knows a bit of MAXScript... That's how I got involved.
This design decision for Amaretto carried over to Krakatoa. In fact, the original UI of Krakatoa written by Mark Wiebe, the "father" and "godfather" of the volumetric particle renderer, was an edited version of the Amaretto UI script file. Krakatoa was already in early Beta when I tried to use that UI and immediately felt the desire to modify it.

One negative side effect of all this was the inability to display the UI inside the Renderer tab of the Render Scene dialog. It is simply not possible to add scripted rollouts to that tab. So the solution was to add a big fat button to that rollout which would then open the scripted UI of the renderer. This was, of course, a slightly inconvenient and quite untypical workflow.

At this point in time, Max 6 had come and gone and the Production/Draft slots of the renderer were replaced by Production only and the ability to save/load render presets. We started discussing ways to keep Krakatoa assigned as the renderer while being able to switch to Scanline, mental ray, VRay, Brazil, you name it. We did not want to lose the current settings of Krakatoa, but the fact that Krakatoa required cooperation with another renderer to produce the final image (including casting shadows from particles onto geometry) meant that we had to make it easy for the user to swap back and forth.

To my surprise, the Render Presets turned out to work great with Krakatoa despite the fact Krakatoa doesn't even use Parameter Blocks like any other Max plugin (our developers found Parameter Blocks clunky and not very stable when changing their structure between versions, so both Amaretto and Krakatoa use a custom string-based storage for all non-animatable / non-object-related properties). So saving and loading a Render Preset with Krakatoa restored everything perfectly.

As a result, I decided to create two MacroScripts that would:
  • Assign Krakatoa as the current renderer if it isn't the current renderer yet when the "Toggle GUI On/Off" is checked, while storing the last renderer's settings in a Render Preset before assigning Krakatoa.
  • Toggle the Krakatoa UI on and off when the main icon is checked/unchecked.
  • Load the last renderer's settings from that preset if the "Remove Krakatoa" button is pressed, while saving the current Krakatoa settings to another Render Preset
  • Load back Krakatoa via that Render Preset if the Toggle button was checked again.
  • Provide a prompt to let the user reset Krakatoa to factory defaults instead of loading a previous Render Preset if desired.
These MacroScripts can be found along with a dozen other useful buttons in the Krakatoa category of the Customize User Interface. They are the MAIN BUTTONS to access Krakatoa, much faster and more powerful than the Open Krakatoa GUI and Assign Renderer controls in the Render Scene Dialog.
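To illustrate the mechanism, here is a simplified sketch of the renderer-swapping idea. This is not the shipping MacroScript - the category name, preset path and renderer lookup are illustrative only:

```maxscript
-- Simplified sketch of the renderer-swapping idea behind the MacroScripts.
-- The category name, preset path and renderer lookup are illustrative only.
macroScript MyKrakatoaToggle category:"Krakatoa Sketches"
(
    local presetFile = getDir #renderPresets + "\\LastRenderer.rps"
    on execute do
    (
        if matchPattern (renderers.current as string) pattern:"*Krakatoa*" then
        (
            -- Leaving Krakatoa: restore whatever renderer was saved before.
            if doesFileExist presetFile do renderPresets.LoadAll 0 presetFile
        )
        else
        (
            -- Switching to Krakatoa: save the current renderer's settings first...
            renderPresets.SaveAll 0 presetFile
            -- ...then assign Krakatoa by searching the renderer classes by name.
            for r in rendererClass.classes \
                where matchPattern (r as string) pattern:"*Krakatoa*" do
                renderers.current = r()
        )
    )
)
```

The real MacroScripts do more (prompting to reset to factory defaults, preserving Krakatoa's own settings in a second preset), but the save-preset / swap-renderer / load-preset cycle is the core of it.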

The fact nobody seems to use them makes me sad. It even makes me write Blogs, as you can see here...

So I beg you, if you are a Krakatoa user, spend 2 minutes to create a Krakatoa toolbar and drag the most vital icons to it. Once you start using them, you will be
  • Orders of magnitude more productive
  • A Happier Person
The available icons are documented here, in the Fine Manual Nobody Reads:
http://software.primefocusworld.com/software/support/krakatoa/macroscripts.php

Some of the other icons provide one-click creation of PRT Loaders, one-click creation of PRT Volumes from any number of selected geometry objects, one-click assignment of KCMs to any number of selected supported objects, as well as buttons to access the Krakatoa Log Window, the Shadows On Geometry utility, and the ONLY way to access the Krakatoa Schematic Flow tool introduced in v1.5.1.

Naturally, you can also assign these MacroScripts to keyboard shortcuts if you want to be even faster!

This information is already in the Online Documentation, but I hope a Blog will be more effective in spreading the word. To quote a sign that hangs in our office, "Only YOU Can Prevent Render Madness!" Make your life easier. Use the shortcuts!

Sunday, September 27, 2009

Lost Gems in MAXScript - Forcing Global Scope Access

A thread on CGTalk just made me realize that there is a hidden gem in the MAXScript syntax barely anyone knows about, let alone uses. Since it is in part my fault that people don't suspect it is there and because the next update of the MAXScript documentation is quite a few months away, I feel it is a good idea to mention it here.

If you search for "Global" in the help, the corresponding topic only ranks 22nd in the results, and the page that links to it ranks 8th, so I don't expect anyone to actually find and read it. I promise I will reorganize these topics next time around, at least the ones related to Scope of Variables, because scope is the biggest source of errors and misunderstandings in everyday MAXScript programming.

Some background info first:

MAXScript has a couple of peculiarities which make coding more "relaxed" for non-programmers, but which as a side effect cause a lot of problems in certain situations:

  • First of all, variables do not need to be declared before being used. You can use a variable name even if it was never declared or defined, and it will have the value 'undefined'.

  • Second, variables do not require an explicit scope declaration. If a variable is used in the top-level context (the Listener, or a script without any enclosing parentheses), it is automatically assumed to be global. If a variable is used in a context below the global one (inside parentheses), it is assumed to be local - unless, of course, the developer explicitly declared it as one or the other. For many years I skipped the explicit scope specification because I knew what I was doing. In the last two years or so, I have started forcing myself to declare every variable explicitly as global or local to make the code clearer to others - if you read the source of the Krakatoa GUI, you will notice.
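A quick sketch of these rules - you can paste it into the Listener to watch them in action:

```maxscript
topVar = 42        -- top-level context: implicitly GLOBAL
(
    innerVar = 10  -- inside parentheses: implicitly LOCAL to this block
    topVar         -- the global is still visible here: 42
)
innerVar           -- back at top level: undefined - the local did not survive

-- Explicit declarations remove all guessing:
global gCount = 0
(
    local temp = gCount + 1  -- clearly local, clearly reading the global
)
```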

Now, the trouble caused by these behaviors is this: suppose you have a script that calls a function (or uses a variable) which is supposed to be defined in the global scope by some other script file. If that other script happens not to have been evaluated yet, evaluating your script creates a new, undefined local variable for that name instead. Even if the script defining the actual function gets evaluated later, your call is still bound to the local variable and sees 'undefined', causing a run-time error like "Call needs function or class, got: undefined".

Everybody with a little MAXScript knowledge knows that the solution is to pre-declare the global variable in both scripts. This way, when the script calling the function is evaluated, the call does not create a new local variable even if the function does not exist yet; instead, it checks the local and global scopes for that name, finds it in the global scope and binds to that global. When the actual function definition is evaluated later, it also finds the existing global variable and replaces the 'undefined' value with the actual function, making the first script operational and avoiding the error.
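Sketched with two hypothetical script files (the file and function names are made up for illustration):

```maxscript
-- Caller.ms - may be evaluated BEFORE the library file:
global libGreet  -- pre-declare, so the call below binds to the GLOBAL name
(
    fn sayHello = libGreet "World"  -- no implicit local is created for libGreet
    -- Calling sayHello() before Library.ms loads would still error,
    -- but once the library IS loaded, the call works without re-evaluation.
)

-- Library.ms - evaluated at some later point:
global libGreet  -- the same pre-declaration finds the existing global
fn libGreet who = format "Hello, %!\n" who
```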

So here is the little hidden gem most people don't know about - the double-colon :: syntax.

In the topic "Specifying Global Variables As Global Using ::" linked from "MAXScript Language Improvements in 3ds Max 8", it is demonstrated that you can prefix any variable with :: and this will force it to look ONLY in the Global Scope, completely ignoring any local scopes!

In other words, whenever you intend to call a global function or access a global variable but you are not sure whether it is already defined at the moment of script evaluation, instead of pre-declaring the variable as global in the script to ensure it is "visible" to the caller, you can simply prefix it with ::!
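For example, a sketch like the following evaluates safely with no pre-declaration anywhere (the function name is again hypothetical):

```maxscript
(
    -- ::libGreet is resolved ONLY in the global scope - no implicit
    -- local is created, and no 'global libGreet' declaration is needed:
    fn sayHello = ::libGreet "World"
)
-- As long as SOME script has defined the global libGreet before
-- sayHello() is actually called, everything works.
```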

You can see some examples I posted in this CGTalk thread:
http://forums.cgsociety.org/showthread.php?f=98&t=810878 

You might say: But Bobo, Global Variables are EVIL! Why would you use them?
The answer is that they play a positive role when defining libraries of functions - for example, a single struct definition containing many functions meant to be accessible to many other scripts. Such libraries are normally stored in an .MS file saved in the Stdplugs\Stdscripts folder or a subfolder thereof. Since that folder is the first location auto-loaded by MAXScript at startup, these structs and functions are visible to any other scripts launched later.

Unfortunately, if two scripts are placed in the same folder and one of them defines global "library" functions the other script should access, ensuring the loading order of these scripts becomes tricky and usually involves fileIn() or include() calls from yet another script, making things quite complicated (that is a whole other topic to write about). Using :: makes the problem go away and lets you call a function or access a struct that you know exists, or WILL exist, in the global scope regardless of the loading order of the script files!
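A minimal sketch of such a library pair - the file, struct, and function names are made up for illustration:

```maxscript
-- MyLib.ms - a "library" file in Stdplugs\Stdscripts:
struct MyLibStruct
(
    fn double x = x * 2,
    fn half x = x / 2.0
)
global MyLib = MyLibStruct()

-- Consumer.ms - sits in the same folder, loading order unknown:
(
    -- :: guarantees we bind to the global MyLib even if Consumer.ms
    -- happens to be evaluated before MyLib.ms:
    fn processValue v = ::MyLib.double v
)
```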


I believe the "Specifying Global Variables As Global Using ::" topic should be merged with the "Scope of Variables" topic or at least extensively linked from all relevant topics. I apologize that this is not the case yet and hope you will find this little known feature useful!

Happy coding!

Tuesday, September 15, 2009

A Quarter Century Of Geekdom

I've got news for you - I am a geek. Chances are if you are reading this, you are in the same boat. And I just realized I am an OLD geek. That's right, it is September 2009, which means that pretty much exactly 25 years ago (+11 days) I touched a computer keyboard for the first time in my life!

The fact that I still remember the exact date (September 4th, 1984) is just one of many pieces of evidence of my geekiness, but since it marks the beginning of my life behind the screens, it could be presented as evidence "A". Of course, some other things in my early life prepared the soil - the fact that I was very into reading (with occasional attempts at writing) Science Fiction, which is where computers used to live in the 70s (until the birth of the first Apple later that decade), and that I was crazy about Star Wars. I was so into it that at 15 I actually wrote a rather long poem depicting everything that happens in The Empire Strikes Back - since I know at least one Bulgarian is reading this blog, here is the link.

In those early days behind the Iron Curtain, nobody expected a political change in our lifetime. In fact, we were brainwashed to assume the status quo could never change - the Soviet Union had existed for almost 70 years, and the Bulgarian anthem contained a line about how "Moscow is with us in peace and war". A nuclear holocaust appeared more probable than a political revolution. Thus the mere idea of some day working in Hollywood visual effects was quite out there, and of course it never crossed my mind. But the desire to do something creative involving computers and futuristic space ships was quite natural, and I was really very surprised when, despite my best attempts to do something else with my life, I ended up doing just that... In a way, I believe I am living the dream I never had.

On that fateful September day in 1984, a schoolmate of mine told me about a computer club. Like everything in those days, membership was completely free. All I had to do was show up at 8 in the morning and register. My friend also gave me a 4-page introduction to computers and BASIC programming, and after devouring the whole thing before going to bed, I couldn't get any sleep because my brain kept trying to combine the few graphics-related commands it knew into something resembling a Zaxxon-like diagonally scrolling game. When the course started in the morning, our teacher told us about an expo with free access to computers that was going on in a big concert hall in Sofia (the expo was called TNTM, short for "Technical and Scientific Creativity of the Youth" in Bulgarian). I went to that hall and spent some of the most exciting days of my life there - I felt something big was happening in my life, but I had no idea how big.

Here is something else interesting to consider. In those days Bulgaria had a population of about 8 million, so statistically speaking, the chances of being born Bulgarian were pretty small. Every country in the communist bloc had an industrial specialty, in order to avoid duplicating efforts across multiple countries. For example, the USSR produced most of the cars and planes, while Bulgaria specialized in building forklifts and... computers. In the early 80s a clone of the Apple II was "created" in Bulgaria, and a couple of years later the production of PC clones started, most of them for export. But it also meant that hardware and software were available, and the education of specialists in these areas was part of state policy. (When Perestroika happened around 1988, almost half of the computer viruses in the world were Bulgarian - a sad result of having too many specialists with nothing to do.) What I am trying to say is that no matter what I did or where I went, computers were around me and appeared to be stalking me... Was I meant to be doing what I am doing? I think so.

So that's how it all started for me. A quarter of a century later, I still get the same excitement when I sit behind a computer, with the small difference I have a slightly better idea what I am doing.

Food for thought: the Apple II used a CPU running somewhere between 1 and 2 MHz. My next computer, the Sinclair ZX Spectrum, had a 3.5 MHz CPU. 25 years later, I am working on 8 cores at 2.66 GHz each. That is a really nice curve, Mr. Moore.