STARLIMS + Google Data Connector = lazy lab data

Yup, almost a full year without a post: shame on me. But here I am, with something cool for you STARLIMS users/developers/administrators!

Recently, I was looking for a good reporting tool – thinking to replace SAP’s Crystal Reports – and somehow stumbled upon Google Data. Google Data is a sort of mix between Microsoft Power BI, Tableau, Qlik, and any and all data visualization/analytics/findagoodbuzzword tools.

… Except it is free. Kind of.

DISCLAIMER: This project is in progress, and I do this in my free time at home. I do not want to release the Google script code just yet, but I do want to share progress and see where it takes me. Please contact me directly if you wish to know more and want to use this. Right now, this is closer to a prototype than anything else.

There are good tutorials on writing your own Data Connectors. Google is really working hard to get developers on board. Of course, since this is not a very well-known tool yet, it is still quite tricky to get going, even when following said tutorials.

But chances are that if you read this, you know me. And you know this is the kind of thing that intrigues me. Challenge! New playground! Undiscovered lands! So here I am: I have given it a try, and I will share my work here. Freely. Eventually, who knows; maybe someone somewhere will use it!


Given the nature of STARLIMS, you can kind of reverse-engineer the code anyway, so aside from giving you a (somewhat) working solution here, I will explain what I have done and share the connector’s code so you can make it better.

To know more about STARLIMS, visit their web site:

To learn how to build your own Google Data Connector, head to

That’s where I started.

Google Data Connector

Now, what’s behind a Google Data Connector? Basically, it comes down to four functions:

  1. Authentication. For STARLIMS, I am using the Username and Password approach. I pass these in headers, so if your connection is secured (HTTPS) then you’re good. If you use normal STARLIMS authentication, then this will work for you. Where I work, we use AD, and it works fine too.
  2. getSchema. This is critical – we need to provide the connector, at the beginning, the full schema of the data. Given the size of STARLIMS, I am opting for either
    1. Predefined entities (Folders, Orders, Ordtask or COC)
    2. QBE Templates (code-based)

I think I can manage the schemas from these quite easily. I need to know field names, labels and types, whether they are measures or dimensions; that’s it.

3. getData. This is the tricky one. It receives a subset of the schema’s fields, and the returned data must match that subset exactly. It can be done, but you need to build the right SQL statements from the schema.

4. getConfig. This ties everything together. The way I built it, you define the URL of your connection (aka the STARLIMS URL) and whether you will use a specific QBE or a predefined entity. When you first create a connection, it will require a username and password. I am not storing these; Google is. It is up to you to decide whether you trust Google with the data…
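To make the four entry points concrete, here is a minimal sketch in the plain-object style of the Data Studio connector API. This is not my actual connector code: the STARLIMS field names (FOLDERNO, NUMSAMPLES) and the config parameter names are illustrative assumptions.

```javascript
// Sketch of the four Data Studio community connector entry points.
// Field and parameter names below are illustrative, not the real connector's.

function getAuthType() {
  // Data Studio prompts for username/password and stores them itself
  return { type: 'USER_PASS' };
}

function getConfig(request) {
  return {
    configParams: [
      { type: 'TEXTINPUT', name: 'url', displayName: 'STARLIMS URL' },
      { type: 'TEXTINPUT', name: 'qbe', displayName: 'QBE template or predefined entity' }
    ]
  };
}

function getSchema(request) {
  // In the real connector, this is derived from the entity/QBE definition
  return {
    schema: [
      { name: 'FOLDERNO', label: 'Folder No', dataType: 'STRING',
        semantics: { conceptType: 'DIMENSION' } },
      { name: 'NUMSAMPLES', label: '# of Samples', dataType: 'NUMBER',
        semantics: { conceptType: 'METRIC' } }
    ]
  };
}

function getData(request) {
  // Data Studio asks for a subset of fields; rows must match that subset
  var wanted = request.fields.map(function (f) { return f.name; });
  var schema = getSchema(request).schema.filter(function (f) {
    return wanted.indexOf(f.name) >= 0;
  });
  // ...here, build the SQL/QBE call and fetch rows from STARLIMS over HTTPS,
  // passing the credentials in the request headers...
  var rows = []; // e.g. [{ values: ['F-0001', 12] }]
  return { schema: schema, rows: rows };
}
```

Note how getData reuses getSchema and filters it down: that is the part where building the matching SQL statement gets tricky.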

And that’s it, really! These are the kinds of charts you can do in a few clicks:

Of course, these are very simple / silly examples. But they show that you can create viz easily, and good-looking ones at that!

My idea / goal is to extend my connector and code so you can create dynamic dashboards on your STARLIMS data using the existing QBE templates you spent hours / days optimizing and cleaning. I think this will definitely be worth it.

You can get started here. I will update the STARLIMS Code and enhance the connector. For now, all I do is a simple query on FOLDERS and ORDTASK to kind of have a proof of concept; it actually works quite well!

Let’s see where this gets me in the future!



P.S. Remember: this is work in progress and not production-ready!

Identify Server Script Performance Bottlenecks

This week was Abbott Informatics’ APAC Forum. Speaking with old colleagues, I got inspired to try to revive this site for the 4th time (or is it the 5th?).

I’m currently sitting in a performance enhancement session and thinking to myself: heh, that’s NOT how I would go about it (sorry guys!). Silver bullets? Nah.

The first step to improving performance is to identify what is slow (duh!). What are the bottlenecks? Why is it slow?

As a STARLIMS developer, I know that oftentimes, the code written in there is not necessarily the most efficient. Therefore, why not start by monitoring the performance of, let’s say, SSL scripts, which represent the backbone of the business layer?

I’m thinking: why not have a simple tool that will record, like a stopwatch, each block of code’s execution time, and then produce a report I can read? Heck, .NET has a Stopwatch class! Hey! STARLIMS IS .NET!

The more I think about it, the more I consider: let’s do it!

How do we do this?

First, let’s create a class. I like classes. I like object-oriented code. I like the way it looks in SSL afterward, and it makes it way easier to scale through inheritance later on. Question is: what should the class do?

Well, thinking out loud, what I want it to do is something like this:

  1. Start the stop watch
  2. Do something
  3. Monitor event 1
  4. Do something
  5. Monitor event 2
  6. Do something else
  7. Monitor event x
  8. so on and so forth
  9. Provide a readable report of all the event duration

I also want it to count the number of times an event runs, and I want to know the AVERAGE time spent in there as well as the TOTAL time this event took.

Now that I know what I want, let’s write the class that will do it (for the sake of this example, I created it in the Utility global SSL category).

:CLASS perfMonitor;

:DECLARE oStopWatch;
:DECLARE nLastCheck;
:DECLARE aEvents;
:DECLARE WriteToLog;

:PROCEDURE Constructor;
:PARAMETERS bAutoStart;
:DEFAULT bAutoStart, .T.;
Me:WriteToLog := .F.;
Me:nLastCheck := 0;
Me:aEvents := { {"Event", "Total Duration", "# of calls", "Avg Duration"} };
Me:oStopWatch := LimsNetConnect("System", "System.Diagnostics.Stopwatch");
:IF bAutoStart;
	Me:oStopWatch:Start();
:ENDIF;
:ENDPROC;

/* turn on UsrMes logging of every monitored event;
:PROCEDURE EnableLog;
Me:WriteToLog := .T.;
:ENDPROC;

:PROCEDURE Monitor;
:PARAMETERS sMessage;
:DEFAULT sMessage, "";
:DECLARE nElapsed, nDuration, i, bNew;
:IF Me:oStopWatch:IsRunning;
	nElapsed := Me:oStopWatch:ElapsedMilliseconds;
	nDuration := nElapsed - Me:nLastCheck;
	:IF Me:WriteToLog;
		UsrMes("Performance Monitor ==> " + LimsString(nDuration) + " ms. Message: " + sMessage);
	:ENDIF;
	Me:nLastCheck := nElapsed;
	bNew := .T.;
	:FOR i := 1 :TO Len(Me:aEvents);
		:IF Me:aEvents[i][1] == sMessage;
			Me:aEvents[i][2] += nDuration;
			Me:aEvents[i][3] += 1;
			Me:aEvents[i][4] := Me:aEvents[i][2] / Me:aEvents[i][3];
			bNew := .F.;
		:ENDIF;
	:NEXT;
	:IF bNew;
		aAdd(Me:aEvents, { sMessage, nDuration, 1, nDuration });
	:ENDIF;
:ENDIF;
:ENDPROC;

:PROCEDURE Restart;
Me:oStopWatch:Restart();
Me:nLastCheck := 0;
Me:Monitor("Internal Restart");
:ENDPROC;

:PROCEDURE Stop;
Me:Monitor("Internal Stop");
Me:oStopWatch:Stop();
:ENDPROC;

:PROCEDURE Start;
Me:oStopWatch:Start();
Me:Monitor("Internal Start");
:ENDPROC;

:PROCEDURE ToString;
:RETURN BuildString2(Me:aEvents, CRLF, "	");
:ENDPROC;

The above gives us an object we can start, restart, and use to monitor events (messages). At the end, we call the typical ToString() and get our “report”. Example of using this:

:DECLARE oPerformanceMonitor;
oPerformanceMonitor := CreateUdObject("SGS_Utility.perfMonitor");
lWait(1.3); /* fake doing something that takes time;
oPerformanceMonitor:Monitor('Step 1');
lWait(0.8); /* fake doing something that takes time;
oPerformanceMonitor:Monitor('Step 2');
lWait(1.1); /* fake doing something that takes time;
oPerformanceMonitor:Monitor('Step 3');
lWait(1.45); /* call Step 1 again to generate an aggregate;
oPerformanceMonitor:Monitor('Step 1');
:RETURN oPerformanceMonitor:ToString();

If I run the above, the output will look like this:

Event	Total Duration	# of calls	Avg Duration
Step 1	2025	2	1012.5
Step 2	1015	1	1015
Step 3	1011	1	1011

I have been using this in many places in our system, and it did help me find the best places to optimize our code. Sometimes the same insert runs 500 times and totals up to 15 seconds; that is worse than a call that runs only once and takes 3 seconds (at least for the end user).

I hope this can help you find the bottlenecks of your SSL code!

C#: Take a WebPage Snapshot

Alright, it’s been a while, and this time around, I have something good! I have this situation where I want to take a screenshot programmatically from a web page. Although there are many examples out there for thumbnails, none really matched what I needed. I wanted to provide:

  • A maximum Width
  • A maximum Height
  • URL
  • Header (in case you need a special authentication header)

Turns out that everything was out there, scattered: pieces here, bits there.

So I tinkered the whole thing together and came up with something that actually works quite well! I still have one challenge left, though: how can I put this in a DLL I can reuse and not get a threading issue? That is the question.

Here’s the code – pretty much self explanatory.

/// <summary>
/// Take a snapshot of a web page. Image will be truncated to the smallest of
/// - smallest between rendered width and maximum width
/// - smallest between rendered height and maximum height
/// </summary>
/// <param name="webUrl">URL to take a snapshot from</param>
/// <param name="authHeader">Authentication header if needed</param>
/// <param name="maxWidth">Maximum width of the screenshot</param>
/// <param name="maxHeight">Maximum height of the screenshot</param>
/// <param name="output">Output image file name</param>
static void WebPageSnapshot(string webUrl, string authHeader, int maxWidth, int maxHeight, string output)
{
    Uri uri = new Uri(webUrl);

    // WebBrowser is a WinForms control: it must live on an STA thread
    WebBrowser browser = new WebBrowser();
    browser.Size = new System.Drawing.Size(maxWidth, maxHeight);
    browser.ScrollBarsEnabled = false;

    browser.Navigate(uri, "", null, authHeader);

    // This is what will make this render completely
    while (browser.ReadyState != WebBrowserReadyState.Complete)
    {
        Application.DoEvents();
    }

    using (Bitmap bitmap = new Bitmap(browser.Width, browser.Height))
    {
        Point location = browser.Location;
        browser.DrawToBitmap(bitmap, new Rectangle(location.X, location.Y, browser.Width, browser.Height));
        bitmap.Save(output, ImageFormat.Png);
    }
}

Circular Network Relationship

Sometimes, one may want to try to find relationships, let’s say, between products. When thinking about this, I thought “Hey! Wouldn’t it be cool if I could sort of create a circle of all the products, and then link them together as a network, so I would see who has both product A and product B?”

And, if you’re familiar with Tableau, that’s the kind of question that leads from one thing to another… Now, don’t get me wrong: I don’t think the below is a good visualization. But it was a great challenge!


The key here is to remember how the Sin(), Cos() and the infamous Pi() functions work so you can lay out the products in a circle. Then, you need to do a self-join to build those relationships.

Part 1: Circle Coordinates (back to the basics!)

1. You need to pick a field you wish to lay out on your circle. I recommend creating a calculated field named “Radial Field” so the next steps are easier. In my case, I’ll use [Product Name].

2. Calculate the X position using this formula: COS(INDEX()*2*PI()/WINDOW_SUM(COUNTD([Radial Field])))

3. Calculate the Y position using this formula: SIN(INDEX()*2*PI()/WINDOW_SUM(COUNTD([Radial Field])))

4. Put your X on Columns, and Y on Rows

5. Change the table calculations of both fields to compute using “Radial Field”

6. Put “Radial Field” on the details pane (and on the label).
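The trigonometry behind steps 2 and 3 can be sketched outside Tableau; here is the same layout in plain JavaScript, where the `products` array stands in for the distinct values of [Radial Field] (remember INDEX() in Tableau is 1-based):

```javascript
// Place each product evenly on a unit circle, mirroring the Tableau calc:
// COS(INDEX()*2*PI()/n) and SIN(INDEX()*2*PI()/n)
function circleLayout(products) {
  var n = products.length;
  return products.map(function (name, i) {
    var angle = (i + 1) * 2 * Math.PI / n; // i + 1 mimics Tableau's INDEX()
    return { name: name, x: Math.cos(angle), y: Math.sin(angle) };
  });
}
```

With four products, the first one lands at the top of the circle (x ≈ 0, y = 1) and the last at the right (x = 1, y ≈ 0), exactly what the Tableau table calculation produces.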

You should end up with something like this:

From there, I suggest you play with this a bit. I will write part 2 soon to get to the next step.



Radar Chart in Tableau without R

Hey guys,

I’ve been trying to find out how to do a radar chart in Tableau without using R. I found this post on The Information Lab, which gives a good idea but is outdated and not really what I was looking for.

I did some research, and after learning about the new Level of Detail feature in Tableau 9.0, I realized I could do it quite easily!


The big idea is that you first find the max radius FIXED on the point, meaning it finds the max value per point and uses that for that axis. Then, you just divide the value by that max, and you get the relative position. Finally, you do some trigonometry, et voilà!
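The idea is easier to see outside Tableau syntax; here is a small sketch (my own illustration, not the workbook's actual calculations) where `maxima` plays the role of the per-axis LOD FIXED max:

```javascript
// Normalize each value by its axis max, then place it on its spoke:
// one spoke per measure, evenly spaced around the circle.
function radarPoints(values, maxima) {
  var n = values.length;
  return values.map(function (v, i) {
    var r = v / maxima[i];            // relative position on that axis (0..1)
    var angle = i * 2 * Math.PI / n;  // spoke angle for measure i
    return { x: r * Math.cos(angle), y: r * Math.sin(angle) };
  });
}
```

So a value of 5 on an axis whose max is 10 sits halfway along its spoke, regardless of the scale of the other measures.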

If there is any interest, I’ll get into the details on how to do this soon.

Me vs World Growth

Hi there!

I have been setting up a Tableau Server sandbox on my personal systems, and I wanted to put it to the test. It’s a pretty silly / simple dashboard, but it can be quite interesting and tells a little story about myself.

Obviously, there’s nothing too awesome about the chart itself, but it shows the integration and the simplicity of it. In this case, there’s obviously not much security around it, but for the purpose, I guess it’s all right.

Tableau? What’s that?

If you stumbled here by mistake (or out of curiosity or both), then you might be wondering what IS Tableau.

Tableau is a product made by Tableau Software that lets people visualize their data. This means putting the data on charts, all kinds of charts, and interacting with it. It’s fast, fun to use, can analyze any kind of data (you can even build your own connectors!), lets you share your insights instantly, and can even connect to real-time data.

If you wish to learn more about the product itself, I recommend you start there.

There are a few pieces to the whole thing:

  • Tableau Desktop is the tool to build workbooks (content)
  • Tableau Server is the server to host these workbooks
  • Tableau Public is a public desktop/server combo; you can host your workbooks there (think cloud), but they are also visible to everyone, so no confidential data there.
  • There’s a mobile version (iPad and Android) which allows you to view/edit server content.
  • There’s also another version in development, called “Elastik”, which is a tool meant to allow users to analyze anything quickly on mobile devices.


I am involved and interested in Tableau products because I integrate an OEM version of these tools with the flagship product of the company where I work.