Manual OD values get registered higher in calibration

During OD calibration, the “manual spectrophotometer OD” entries are registered higher than the Pioreactor measurements, causing two different lines to develop, as shown in the picture below:

Changing the HAT unit on the Pioreactor seems to solve the issue.

Does anyone know whether this is a hardware issue? Any advice on how to troubleshoot is appreciated.

Hi @sharknaro

What do you mean by “changing the HAT unit”? Like physically swapping the HAT for another HAT? I haven’t seen anything like this before. It might be a software bug instead.

You have a lot of manual samples in your data, which is fine, but there could be an edge case here. Can you tell us what combination of parameters you are using at the start of the OD calibration?

Anything else you can share about the process? Are you diluting the samples for the manual OD600?

We recognize our OD calibration needs some attention, so we’d be happy to hear feedback about the algorithm (and you’ve already provided some feedback in Advice for OD calibration range).

Hi Cameron,

To answer your questions:

Yes, I swapped the HAT units with other HATs

Sorry for not providing details on this! So my calibration parameters were:

  1. Max OD = 3.5
  2. Min OD = 0.02
  3. Dilution volume = 4 mL

For these parameters, the software asked me for 18 point measurements, with an option to enter a manual “spectrophotometer” measurement after every 4 + 4 = 8 mL of media added.
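To show what I mean, here is a rough Python sketch of the dilution arithmetic as I understand it (this is my own reconstruction, not Pioreactor’s actual code; the 8 mL prompt threshold and the reset back to 10 mL are assumptions based on what the CLI showed me):

# Rough sketch (not Pioreactor's code) of how the "inferred" OD values follow
# from serial dilution: each media addition dilutes the culture, so
# OD_new = OD_old * V / (V + added).
initial_volume_ml = 10.0   # starting vial volume (assumed)
dilution_ml = 4.0          # my dilution volume
od = 3.5                   # my max OD (starting culture)

volume = initial_volume_ml
added_since_prompt = 0.0
for step in range(1, 7):
    od *= volume / (volume + dilution_ml)
    volume += dilution_ml
    added_since_prompt += dilution_ml
    print(f"step {step}: inferred OD ~ {od:.3f}")
    if added_since_prompt >= 8.0:
        # after 4 + 4 = 8 mL added, the CLI prompts for a manual OD600 entry
        # and the vial is brought back down to the starting volume
        print("  -> manual spectrophotometer OD600 prompt")
        volume = initial_volume_ml
        added_since_prompt = 0.0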

I calibrated 16 Pios at the same time, and none of them had an issue, except two Pios which had this issue until I swapped the HAT units. Because of this, I would consider the issue to be hardware related rather than software related. However, I did not have time to test the removed HAT units on other Pios to see if the same issue persists.

I can run some tests this weekend and update you on that.

I diluted the initial two samples because they were above 1 (OD > 1). However, for the rest I measured directly (without dilution).

I do not know what additional information I can provide, so if I am missing something, please let me know and I will be happy to elaborate.

  1. Can you send us the results of a good calibration (from one of the runs that succeeded)? i.e., the output of pio run od_calibration display

  2. Had you run a self-test on the Pioreactors beforehand? If so, did they pass?

One thing we are curious about is the almost one-to-one nature of the “manual” OD values with the “inferred” OD values in the chart. For a 10 mL start and 4 mL dilution, we expect there to be a 2-to-1 inferred-to-manual ratio. Does that make sense?

Here is the data for a successful calibration:

{
"type": "od_90",
"created_at": "2024-04-06T15:44:47.370356Z",
"pioreactor_unit": "P07",
"name": "od-cal-2024-04-06",
"angle": "90",
"maximum_od600": 3.5,
"minimum_od600": 0.006,
"minimum_voltage": 0.022968408136512176,
"maximum_voltage": 3.5133738521462625,
"curve_type": "poly",
"curve_data_": [
-0.07447459801429754,
1.2727848927491558,
0.014597710176138248
],
"voltages": [
3.5133738521462625,
2.811935213414734,
2.1315097991251517,
2.091689269093525,
1.5210639433462623,
1.156651410925214,
1.142835162859364,
0.762304168393797,
0.5730418512034993,
0.5872258216525401,
0.40551942036141825,
0.3186521807097925,
0.3052613380685405,
0.2240220041962817,
0.1781993794927824,
0.1725932837868048,
0.13150771376980047,
0.10409852238535953,
0.10493012509821495,
0.08642839086756976,
0.06906339515624996,
0.06630502205993102,
0.06332040625845456,
0.04694884702612076,
0.022968408136512176
],
"od600s": [
3.5,
2.5,
1.9444444444444444,
1.7,
1.2142857142857142,
0.9444444444444444,
1.0,
0.7142857142857143,
0.5555555555555556,
0.42,
0.3,
0.23333333333333334,
0.25,
0.17857142857142858,
0.1388888888888889,
0.14,
0.1,
0.07777777777777778,
0.07,
0.05,
0.03888888888888889,
0.035,
0.025,
0.019444444444444445,
0.006
],
"ir_led_intensity": 50.0,
"pd_channel": "2"
}
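As a sanity check on my side, the curve_data_ above seems to reproduce the recorded voltages reasonably well if I treat it as a degree-2 polynomial mapping OD600 to voltage with the highest-degree coefficient first (that ordering and direction are my assumptions):

import numpy as np

# Coefficients copied from curve_data_ above; assumed order: highest degree first.
coeffs = [-0.07447459801429754, 1.2727848927491558, 0.014597710176138248]

# A few (od600, voltage) pairs taken from the arrays above.
points = [
    (3.5, 3.5133738521462625),
    (2.5, 2.811935213414734),
    (0.1, 0.13150771376980047),
    (0.006, 0.022968408136512176),
]

for od, measured_v in points:
    predicted_v = np.polyval(coeffs, od)  # evaluate the calibration polynomial at OD600
    print(f"OD600 {od:>6}: measured {measured_v:.3f} V, curve predicts {predicted_v:.3f} V")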

Yes, I did, and all the Pios, including the ones affected by the issue in this topic, passed all the tests.

I assume that by “ratio” you are referring to the number of OD values registered. If that is the case, then yes: for a 10 mL start and 4 mL dilution, there is a 2-to-1 inferred-to-manual OD ratio (i.e., I registered 1 “manual” OD value after every 2 “inferred” OD values). Let me know if I understood it correctly.

EDIT: I would like to state that I do not have any data for the “faulty” HATs, as I had to stop the calibration before saving the outcomes. Hence, I only have the picture I provided at the beginning.

Hi @sharknaro,

I assume that by “ratio” you are referring to the number of OD values registered. If that is the case, then yes: for a 10 mL start and 4 mL dilution, there is a 2-to-1 inferred-to-manual OD ratio (i.e., I registered 1 “manual” OD value after every 2 “inferred” OD values). Let me know if I understood it correctly.

Yes, that’s right - but the first chart displayed seems to have the same amount of “manual” data as “inferred” data (you say “two different lines to be developed” - I assume the top line is all manual data, and the bottom is inferred).

One more question: are you gathering the manual spectrophotometer OD600 for each calibration, or do you have a “master” sample that you run in the spec, and then use that value in all the Pioreactors?

We are quite stumped here and can’t really explain why this happened. We actually don’t think it’s a hardware issue. We’re going to create an issue around this and detail some of our working theories (none of which have much support).

I know it wasn’t saved, but if you have any more data from the first run you posted, we’d happily accept it.

Hi again Cameron,

Sorry for missing a small detail! According to my notes, I used a dilution volume of 5 mL for the “faulty experiment” I mentioned above, and when you use 5 mL, the ratio drops to 1-to-1 for manual-to-inferred OD values. This is the reason for having the same amount of “manual” data as “inferred” data.

The way I do the calibration is that:
1. I open a separate command line interface (CLI) per Pioreactor
2. Prepare a bacterial solution at the MAX OD value
3. Measure OD on one Pioreactor and jump to the next Pioreactor
4. Repeat this for all other Pioreactors to complete the first measurements
5. Return to the initial Pioreactor and start measuring the second OD values
6. After 2 dilutions (2 x 4 mL additions), take a sample from the vial and measure the OD on the spectrophotometer
6.1 If the sample OD stated in the CLI is > 1, take a 100 µL vial sample and dilute it in 900 µL of media to measure the OD (see the sketch after this list)
6.2 If the sample OD stated in the CLI is < 1, take a 1 mL vial sample to measure the OD directly
7. Repeat steps 3-6 until the calibration is complete
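For step 6.1, here is a minimal sketch of how I back-calculate the undiluted OD from the diluted spectrophotometer reading (the 10x factor follows from the 100 µL + 900 µL dilution; the reading below is just a made-up example):

# Back-calculating the undiluted OD600 from a diluted spectrophotometer reading
# (step 6.1: 100 µL sample + 900 µL media = 10x dilution).
sample_ul = 100.0
diluent_ul = 900.0
dilution_factor = (sample_ul + diluent_ul) / sample_ul  # = 10.0

measured_od = 0.25                       # hypothetical spectrophotometer reading
undiluted_od = measured_od * dilution_factor
print(f"diluted reading {measured_od} -> OD600 entered in the CLI: {undiluted_od}")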

So, I gather spectrophotometer OD600 measurements (“manual” measurements) by sampling from the same vial I use for calibration, and I take a measurement every time I am prompted with the “enter the manual OD600” option, which occurs after adding a total volume of 8 mL to the Pioreactor (assuming an initial vial volume of 10 mL), that is (see also the sketch after the list below):

  1. For a 2 mL dilution volume: Measure the manual OD600 after 4 x 2 mL (8 mL) has been added
  2. For a 4 mL dilution volume: Measure the manual OD600 after 2 x 4 mL (8 mL) has been added
  3. For a >4 mL dilution volume: Measure the manual OD600 after every media addition, as a single >4 mL addition leaves no room for another dilution step before the prompt (i.e., the total added volume would exceed 8 mL)
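And a quick back-of-the-envelope check of that prompt frequency for different dilution volumes (my own arithmetic, assuming the prompt comes whenever another addition would push the total past 8 mL):

# Number of media additions ("inferred" OD values) per manual OD600 prompt,
# assuming the prompt fires before the total added volume would exceed 8 mL.
prompt_after_ml = 8.0
for dilution_ml in (2.0, 4.0, 5.0):
    additions = max(int(prompt_after_ml // dilution_ml), 1)
    print(f"{dilution_ml:g} mL dilution: {additions} inferred value(s) per manual value")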

I am going to run the calibration again today, and I will also include the “faulty” HATs to see if the issue still persists. In either case, I will save the data and provide it in the forum.

Sorry for the late follow-up on my earlier promise of trying the “faulty” HATs.

I updated all the workers to the latest software version (in this case 24.4.11), and after that I used the “faulty” HATs to run the calibration, which worked perfectly fine without any issue!