How hard is it to 3D scan a lightbulb? Part 1

Last year, whilst curating the temporary Digital Frontiers exhibition in the brilliant Octagon gallery at UCL, Nick Booth (UCL curator of the science and engineering collections) and I became a bit obsessed with light bulbs.  From this slightly odd obsession, and thanks to a kind research grant from the Institute of Making, Nick and I get to play with light bulbs and call it research.

We’re looking at the process of materials and making, using 3D scanning and printing to see if creating new objects encourages a closer inspection and deeper understanding of historical objects – in our case light bulbs.  Neither Nick nor I claim to be experts in 3D scanning museum objects; we wanted to see what we could do with the bare minimum of training on the devices.  How easy is it for a relatively normal person to scan and print museum objects?

Firstly we played with 3D scanning.   In the next blog post I’ll talk about the process of 3D printing.

We wanted to look at different ways we could scan and create a 3D mesh of a light bulb.  We tried two main ways of scanning: first using easily accessible and relatively cheap (in this case free) technology, 123D Catch, and then having a go with a NextEngine.

123D Catch is a free application from Autodesk that enables you to take a series of photos and turn them into 3D models.  We used the handy iPhone app.  It works by taking multiple digital photos that have been shot around a stationary object and then submitting those photos to a cloud based server for processing.  The images are then stitched together to produce a 3D model.

NextEngine is a desktop 3D scanner which captures 3D objects in full colour with multi-laser precision.

We knew a light bulb wasn’t going to be easy to scan because scanners don’t tend to like transparent, shiny or mirrored objects.  But we thought we’d have a go anyway.

Scanning transparent objects in practice

Before we experimented with some of the historical science and engineering collection, we used a normal every day bulb to see what worked and didn’t.

Firstly 123D catch

And now NextEngine


As you can see, the fitting shows up pretty well, but the glass bulb itself really doesn’t work.  And 123D Catch has gone completely funny.  So after a bit of googling, tweeting and some advice from the 3D pros at UCL, we decided to try to disguise the transparency with talcum powder.

So here are the versions with talc.

123D catch

NextEngine


Which amazingly worked!

After checking with conservation, we decided to cover the historical light bulb in talc and try scanning that.  Here are the results:

123D catch

NextEngine


The NextEngine scan is pretty good. It doesn’t quite capture the peak at the top of the bulb, but it isn’t a bad representation.  I don’t think we can quite call it a replica but it definitely looks like a lightbulb.

Obviously, covering a historical object in talc throws up a lot of questions about how museums could utilise 3D scanning if they have to cover delicate and fragile glass objects in powder to get an adequate scan. We’d be really interested to hear if anyone has found a more conservation-friendly way of dealing with transparent objects without having to coat them in talc.

It also brings up questions about how accurate 3D representations of museum objects should be.  Should they be identical? Or is an approximate object acceptable?