Brian recently started a new job. That means spending some time poking around the code base, trying to get a grasp on what the software does, how it does it, and maybe a little bit of why. Since the users have some complaints about performance, that's where Brian is mostly focusing his attention.
The "good" news for Brian is that his predecessors were "geniuses", and they came up with a lot of "clever" solutions to common problems. The actually good news is that they've long since moved on to other jobs, and Brian will have a free hand in trying to revise their "cleverness".
void ReportInstance::WriteData(SQLConn & conn)
{
    XSQL_QUERY("delete from report_data where report_id = " << GetInstID(), conn);

    XString sXML(GetDetailAsXML());
    {
        XBuffer buff(sXML);
        buff.ZipCompress();
        sXML = RawToHex(buff.GetBuff(), buff.GetSize());
    }

    int iSize(sXML.GetLength());
    int iRow(0);
    for (int i = 0; i < iSize; i += 248)
    {
        XString sFrag("");
        if ((iSize - i) > 248)
        {
            sFrag = sXML.Mid(i, 248);
        }
        else
        {
            sFrag = sXML.Mid(i);
        }

        XSQL_QUERY("insert into report_data (report_id, seq, chunk) values ("
            << GetInstID() << iRow << ZString("[" + sFrag + "]")
            << ")", conn);
        iRow++;
    }
}
Even just skimming this code sets my eye to twitching, mostly from the number of XML-related objects in it. This is a "clever" solution to the problem of running a report and saving the results.
Run the query, and capture the results as XML. Take that XML, run it through zip compression, and hex-encode the compressed bytes. Then split that encoded content into 248-character chunks, and save each chunk as its own row back in the database.
This elegant solution is easily reversed to reassemble the report data. Even better, this removes the challenge of dealing with obscure and difficult database datatypes like blobs. The chunk column in the database is, as you might expect, a VARCHAR(250).