
Tamarin Part I.I – AS3 Array (bug)

December 14, 2009

Thanks to the analysis by Jackson Dunstan, I was able to find a “bug” in the Array management.

I wrote an article about the array data structure in AS3 a few days ago, and Jackson ran some more tests on arrays after that.

From the Tamarin code, I was able to conclude that removing an element from the array would split it into two parts: a dense array (DA) and a hash table (HT).

And with the checkForSparseToDenseConversion function, Tamarin was reconnecting the spliced parts of the array.

But the thing is: no lower limit for the HT is set when the array is split in two.
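To make the split concrete, here is a minimal sketch of what the delete path does. This is my own simplification, not Tamarin's actual code — the `splitOnDelete` name and the returned shape are assumptions for illustration; only the behavior (upper part moves to a hash table, lower bound never set) reflects what is described here.

```typescript
// Hypothetical simplification of Tamarin's delete path: removing index k
// from a dense array moves every element above k into a hash table (HT).
function splitOnDelete(dense: number[], k: number) {
  const ht = new Map<number, number>();
  for (let i = k + 1; i < dense.length; i++) {
    ht.set(i, dense[i]); // upper part goes to the HT, keyed by index
  }
  dense.length = k; // lower part stays dense
  // Bug: the low HT barrier should become k + 1 here, but the real code
  // leaves it unset, so later conversions can't find these entries.
  const lowHTentry: number | undefined = undefined;
  return { dense, ht, lowHTentry };
}
```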

Here is the ArrayClass bug I submitted to the Adobe Bug Base:

// We're deleting an element in the middle of our array.  The lower
// part can be left in the dense array but the upper part needs to
// get moved to the HT.
else
{
	for (uint32 i = index + 1; i < getDenseLength(); i++)
	{
		ScriptObject::setUintProperty (i, m_denseArr.getAtFast(i));
	}
	m_denseArr.splice (index, 0, (getDenseLength() - index), 0);
}

There is absolutely nothing in there to reset the low barrier!
It would be very easy to insert that low-barrier update here, because we know that the next index is valid (m_lowHTentry = index + 1).

Hence, checkForSparseToDenseConversion compares the wrong values and can't find the entries that should be moved back into the dense array.
Since m_lowHTentry is invalid, execution never enters the while loop:

void ArrayObject::checkForSparseToDenseConversion()
{
	// check for lowHTentry being consumed
	if (m_lowHTentry == NO_LOW_HTENTRY)
		return;
	if (getDenseLength() != m_lowHTentry)
		return;
	while (getDenseLength() == m_lowHTentry)
	{

So how do we re-establish the connection?

var arr:Array = new Array();
// Init all values
for (var i:int = 0; i < 1000000; i++) { arr[i] = 33; }
// Delete an element, putting all subsequent values in the HT (but the low HT index is not reset)
delete arr[0];
// Set an element (it's not part of the DA, so it overwrites an HT entry).
// Since it's lower than the previous low HT index, the low HT index is set to this index.
arr[1] = 33;
// Insert in the DA and run checkForSparseToDenseConversion (now that the low HT index means something, it can move all the data back into the DA)
arr[0] = 33;

The “delete” statement is expensive because it moves all the values into the HT. But the last statement (arr[0] = value) is expensive too: it walks the whole HT to move the values back into the DA! (This is what we didn't see in Jackson's tests.) His conclusion was that the only way to use the DA after deleting an element was to set all the items back to their values.

But with a better understanding (thanks, Tamarin) of what's going on, the solution is simpler: set arr[1] first to reset the low barrier (it can even be arr[1] = arr[1];), then set your value with arr[0].
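The whole mechanism, including why the ordering of the two writes matters, can be sketched as a toy model. Everything here — the `SplitArray` class, its `set`/`delete` methods — is my own hypothetical reconstruction of the behavior described above, not Tamarin's API; it only mimics the DA/HT split, the unset low barrier on delete, and the conversion check.

```typescript
// Toy model (assumed semantics, not Tamarin's API) of an AS3 Array split
// into a dense part (DA) and a hash table (HT), with a low-HT-entry barrier.
const NO_LOW_HTENTRY = -1;

class SplitArray {
  dense: number[] = [];
  ht: Map<number, number> = new Map();
  lowHTentry: number = NO_LOW_HTENTRY;

  set(index: number, value: number): void {
    if (index === this.dense.length) {
      // Appending to the DA may reconnect the HT part.
      this.dense.push(value);
      this.checkForSparseToDenseConversion();
    } else if (index < this.dense.length) {
      this.dense[index] = value;
    } else {
      // A write into the HT does update the barrier -- unlike delete.
      this.ht.set(index, value);
      if (this.lowHTentry === NO_LOW_HTENTRY || index < this.lowHTentry) {
        this.lowHTentry = index;
      }
    }
  }

  delete(index: number): void {
    // Mirrors the buggy ArrayClass path: the upper part moves to the HT,
    // but lowHTentry is never reset to index + 1.
    for (let i = index + 1; i < this.dense.length; i++) {
      this.ht.set(i, this.dense[i]);
    }
    this.dense.length = index;
  }

  checkForSparseToDenseConversion(): void {
    if (this.lowHTentry === NO_LOW_HTENTRY) return;
    if (this.dense.length !== this.lowHTentry) return;
    // Drain consecutive HT entries back into the DA.
    while (this.ht.has(this.lowHTentry)) {
      this.dense.push(this.ht.get(this.lowHTentry) as number);
      this.ht.delete(this.lowHTentry);
      this.lowHTentry++;
    }
    // Recompute the barrier from whatever is left in the HT.
    let min = NO_LOW_HTENTRY;
    this.ht.forEach((_v, k) => {
      if (min === NO_LOW_HTENTRY || k < min) min = k;
    });
    this.lowHTentry = min;
  }
}
```

With this model, deleting arr[0] and then writing arr[0] directly leaves the HT stranded (the barrier is still unset), while writing arr[1] first restores the barrier, so the next write at arr[0] drains the HT back into the DA.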

I’ll be filing a bug report on the Adobe Bug Base tonight about this problem.

Thanks Jackson for challenging this post!

*Edit 2009/12/15*
I just added the bug in Adobe Bug System: FP-3477


2 Comments
  1. Again very interesting! I’ll have to performance test this when I get back to the machine I did my previous tests on. I’ll bet it’s a ton faster than re-filling the whole Array. Keep up the good work!

  2. I just added the bug in Adobe Bug System: FP-3477
    https://bugs.adobe.com/jira/browse/FP-3477
