json - Performance impact of instanced objects in JavaScript
I have an application that calculates and stores data in the following way (the case here has obviously been simplified; in reality there are many more properties in the inner object):
var processedData = [];
sourceData.forEach(function (d) {
    processedData.push({
        a: getA(d),
        b: getB(d),
        c: getC(d)
    });
}, this);

function doStuff(row) {
    // stuff
}
The number of objects created here can be high (thousands). Performance is fine with the current approach, but in a wider context I think I could improve code readability and testability if I moved to a more defined object format:
var Row = function (a, b, c) {
    this.a = a;
    this.b = b;
    this.c = c;
    this.doStuff = function () {
        // stuff
    };
};

var processedData = [];
sourceData.forEach(function (d) {
    processedData.push(new Row(
        getA(d),
        getB(d),
        getC(d)
    ));
}, this);
There are two elements I'm worried about here. The first is the performance/memory cost of constructing an instanced object with new. The second is the memory cost of including a function in an object that has thousands of instances. I'm not sure how clever JavaScript is about this kind of thing.
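One way to get a concrete feel for the cost is a rough micro-benchmark. The sketch below is not from the original question: getA/getB/getC are stubbed out, the row count and the doStuff body are made up, and absolute timings will vary a lot between engines. It only compares object literals, a constructor with a per-instance method, and a constructor with a prototype method.

// Stub helpers standing in for the question's getA/getB/getC (assumed shapes).
function getA(d) { return d; }
function getB(d) { return d * 2; }
function getC(d) { return d * 3; }

var N = 100000;
var sourceData = [];
for (var i = 0; i < N; i++) { sourceData.push(i); }

// 1. Plain object literals (the current approach)
console.time("literal");
var literals = sourceData.map(function (d) {
    return { a: getA(d), b: getB(d), c: getC(d) };
});
console.timeEnd("literal");

// 2. Constructor with a per-instance function (one closure per row)
function RowClosure(a, b, c) {
    this.a = a; this.b = b; this.c = c;
    this.doStuff = function () { return this.a + this.b + this.c; };
}
console.time("per-instance");
var perInstance = sourceData.map(function (d) {
    return new RowClosure(getA(d), getB(d), getC(d));
});
console.timeEnd("per-instance");

// 3. Constructor with the function on the prototype (one shared function)
function RowProto(a, b, c) {
    this.a = a; this.b = b; this.c = c;
}
RowProto.prototype.doStuff = function () { return this.a + this.b + this.c; };
console.time("prototype");
var protos = sourceData.map(function (d) {
    return new RowProto(getA(d), getB(d), getC(d));
});
console.timeEnd("prototype");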
Regarding organization: use the prototype a bit. That gives you one function instead of thousands:
function Row(a, b, c) {
    this.a = a;
    this.b = b;
    this.c = c;
}

Row.prototype.doStuff = function () {
    // stuff
};
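The processing loop from the question stays the same; only the method definition moves. A short usage sketch (assuming the question's sourceData and getA/getB/getC):

var processedData = [];
sourceData.forEach(function (d) {
    processedData.push(new Row(getA(d), getB(d), getC(d)));
});

// Every instance shares the single doStuff defined on Row.prototype,
// so thousands of rows still only carry one function object between them.
processedData[0].doStuff();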
I'd also suggest using a plain for loop instead of forEach. It makes no noticeable difference for a small collection, but for big ones it makes sense; see the sketch below.
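A sketch of the same processing loop written with a plain for loop (assuming the question's sourceData and getA/getB/getC, and the prototype-based Row above):

var processedData = [];
for (var i = 0, len = sourceData.length; i < len; i++) {
    var d = sourceData[i];
    processedData.push(new Row(getA(d), getB(d), getC(d)));
}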
It depends on how you'd work with the collection. If you don't need sorting, grouping, etc., and only need random access by key, you can try creating a hash object with the field as the key, like below:
function getValue(key) {
    return hash[key];
}

var hash = {
    "key1": { "a": 1, "b": 2, "c": 3 },
    "key2": { "a": 1, "b": 2, "c": 3 }
};
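If you go that way, building the hash from the source data might look roughly like this. This is only a sketch: getKey is a hypothetical helper (not in the original post) standing in for however you derive a unique key from each source item.

var hash = {};
sourceData.forEach(function (d) {
    // getKey is assumed; replace with whatever uniquely identifies a row
    hash[getKey(d)] = { a: getA(d), b: getB(d), c: getC(d) };
});

// Constant-time lookup by key instead of scanning an array
var row = getValue("key1");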
I'm not sure about getA, getB and getC; they could perhaps be re-engineered as well.
Hope this helps.