Stars in our eyes

Since I started writing this blog I’ve found it hard to find things to talk about. To overcome this I’ve done something I haven’t done in a while: I started reading the news. Not the local paper or even the Australian; I spend a lot of my time online, so I finally used Google News to my advantage to find out new things about science. While I haven’t been able to make posts about most of the interesting things I’ve come across, yesterday something piqued my interest and I couldn’t help but go with it. I’ve wanted to write a blog post about spacey stuff and now I’ve finally got my chance.

Hubble in orbit

Image of the Hubble Space Telescope. Credit to NASA and STScI

The article that caught my attention was about an image from the Hubble Telescope. Now, I’ve heard of the Hubble Telescope, but to my shame I had always imagined it to be just some gigantic telescope in an observatory somewhere looking out into the sky. Instead, what I’ve learnt is that it is in fact a telescope that orbits the Earth, taking pictures and relaying them back to us here on the ground.

Star-Forming Region LH 95 in the Large Magellanic Cloud

Image of the Star-forming region LH 95 in the Large Magellanic Cloud, taken by the Hubble. Source:

It begs the question: why space? What’s wrong with the telescopes we have here on Earth? Well, it turns out that Earth’s atmosphere can interfere with and distort what we see through ground-based telescopes. Moving pockets of air in the atmosphere are one problem, and the other is that the atmosphere can absorb or block out radiation coming from stars and other space objects, and those different types of emitted radiation are the best way for scientists to study them. It’s similar to how the ozone layer in our atmosphere blocks out most of the harmful UV radiation from the sun, or at least tries to.

So how does this telescope work? As light comes into the telescope it hits one mirror, the primary mirror, and is then reflected towards a second mirror, which focuses the light into the instruments that sit just behind the primary mirror. This diagram is probably the best way to visualise it. The Hubble telescope has six different instruments that look at different types of light to gain different information, all except the ‘Fine Guidance Sensors’, which are used to keep the Hubble orientated the right way and also to measure the distances between stars.
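To put some rough numbers on that folded mirror design, here’s a small sketch using commonly quoted Hubble figures (a 2.4 m primary mirror and an overall f/24 system); the 15-micron pixel size is just an assumed example for illustration, not a real Hubble detector spec.

```python
import math

# The secondary mirror folds the light path, stretching the effective focal
# length far beyond the length of the telescope tube itself.
primary_diameter = 2.4  # metres (Hubble's primary mirror)
focal_ratio = 24.0      # overall system f-number (f/24)

effective_focal_length = primary_diameter * focal_ratio  # ~57.6 m

# Angle of sky one detector pixel would see, for an assumed 15-micron pixel:
pixel = 15e-6  # metres (hypothetical pixel size, for illustration only)
arcsec = math.degrees(pixel / effective_focal_length) * 3600

print(f"effective focal length: {effective_focal_length:.1f} m")
print(f"plate scale: {arcsec:.3f} arcsec per pixel")
```

A focal length of nearly 60 metres packed into a tube about 13 metres long is what lets such fine detail land on the instruments behind the mirror.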

Fun fact – when the Hubble was first sent up into orbit, scientists found that the quality of the images they were receiving was much lower than expected. This was due to a tiny flaw in the primary mirror: it had been ground to slightly the wrong shape, enough to cause the light to be focused in the wrong spot and distort the image received. This was fixed by installing a corrective optics instrument (called COSTAR) that, as the name suggests, corrected the images being seen by the other instruments. At this stage the instruments in the Hubble all have built-in corrections, so the corrective optics instrument was removed on a servicing mission in 2009.

Hubble Telescope.

Image of Astronauts on the final servicing mission of Hubble in 2009. Image credit to NASA

Without getting into too much detail about each of the different instruments and how they work (though it’s pretty fascinating stuff), I’ll just mention that each one has a job to do in helping scientists learn vastly more about things across our universe, even extremely far away. If you would like to read about each of them, here’s the place to go.

The Hubble is controlled by engineers on the ground, who send it directions and commands via satellites. This is also how we receive the data back: the Hubble sends the information it gathers to the satellites, which transmit it to the scientists on the ground. Similarly to the Australian Synchrotron, scientists who want to use the Hubble have to present their proposals to a review committee, and the ones the committee believes will make the best use of the Hubble’s capabilities, as well as address the most pressing astronomical questions, get the green light. Out of the approximately one thousand proposals that are reviewed each year, only about 200 are picked.

Hubble Observes Infant Stars in Nearby Galaxy

Image from the Hubble of a star-forming region in the Small Magellanic Cloud. Source:

Unfortunately the Hubble won’t last forever. Its expected lifespan has already been extended with various servicing missions, however its different parts will eventually degrade and cause the telescope to stop working.

But wait! A new telescope is in production to take over from the Hubble, it’s called the James Webb Space Telescope (JWST) and is looking at a 2018 launch date. It seems that we humans have come a long way since the days of believing that the Earth was the centre of the universe, but we’re still looking to the stars.

A Perfect Storm of Turbulent Gases in the Omega/Swan Nebula (M17)

Image of a “storm” of turbulent gases in the Omega/Swan nebula (M17). Source:


The other Franklin scientist

Today when I opened up Google I was pleasantly surprised to see that their doodle for today (yes, that’s actually what those picture thingies are called, though I’ve spent god only knows how many hours playing the Pac-Man version of their interactive ones) was commemorating the birthday of Rosalind Franklin. While I had heard about DNA all through high school, I don’t think I had ever heard of Rosalind Franklin until doing science at university level.

Understanding the structure of DNA, how the molecules connect together and fold up, has been an important scientific discovery and a stepping stone towards learning about genetics and how we inherit DNA from our parents. It was James Watson and Francis Crick (and less famously Maurice Wilkins) who won the Nobel Prize in Physiology or Medicine in 1962 for proposing that there were two chains in DNA held together by the nucleic acid base pairs in the middle, working out how those bases bond together, and showing that it is all in a double helix shape.

How they first came up with this theory, however, is where Rosalind Franklin comes in. Franklin had taken X-ray diffraction images of DNA that first implied that DNA had a helical structure, in particular, as most people know it, a double helix. X-ray diffraction involves passing X-rays (non-visible light at a higher energy than visible light) through an object; the object scatters the X-rays, which then interfere to make patterns that depend on the shape and size of the structures inside it. Franklin had worked with X-ray diffraction quite a bit and was able to interpret the helical shape from this picture (the famous ‘Photo 51’).
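The pattern-to-size relationship at the heart of this is captured by Bragg’s law, n·λ = 2d·sin(θ): the spacing d between repeating layers in a structure sets the angles θ at which X-rays of wavelength λ reinforce each other. A tiny sketch with illustrative numbers (the angle here is made up; the wavelength is the common copper K-alpha lab source):

```python
import math

# Bragg's law: n * wavelength = 2 * d * sin(theta), with n = 1 here.
wavelength = 1.54  # angstroms, copper K-alpha (a common lab X-ray source)
theta_deg = 22.0   # diffraction angle in degrees (made-up example value)

# Solve for d, the repeat spacing that would produce a peak at this angle
d = wavelength / (2 * math.sin(math.radians(theta_deg)))
print(f"repeat spacing d = {d:.2f} angstroms")
```

Franklin’s analysis was far more involved than this, of course, but the core idea, working backwards from the pattern to the spacing and shape of the structure, is the same.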

Image from wikipedia

Watson and Crick had previously built a proposed model of the DNA structure that had three strands instead of two and was inside out compared to the final model. However, when Rosalind Franklin and her colleague Wilkins visited Watson and Crick to see the model, Franklin pointed out many flaws in it, and the pair were told by their superiors to discontinue their work on DNA.

It was later on, when Wilkins showed Watson Rosalind Franklin’s X-ray diffraction image of DNA without her knowledge, that he and Crick were able to develop a model of the proper structure of DNA.

Franklin was beaten to the discovery of the structure of DNA, however it can be seen that she played an important role in Watson and Crick’s development of their model. Unfortunately Rosalind Franklin became ill and passed away from ovarian cancer (aged only 37) in 1958, before Watson, Crick and Wilkins received their Nobel Prize for the work, and the Nobel Prize cannot be awarded to someone after they have passed away.

Diffraction 006

X-ray diffraction image from the lightbulb filament from Royal Institution

(Ironically, Rosalind Franklin’s intense work and exposure to x-rays may have been a possible cause for her cancer.)

Missing out on her Nobel Prize, and the way she was treated by male scientists at the time, has caused Rosalind Franklin to become something of a feminist icon, and there have been multiple books and biographies written about her, telling her side of the story.

Written in the DNA

Hey guys, now you may or may not remember, but I mentioned a while ago that I did my honours in Forensic and Analytical Chemistry, and this is what my whole undergraduate degree was about as well. Unfortunately, at the time I was going through the course it was still developing, and there were not as many forensic-related topics compared to analytical chemistry as there are now. However, I did learn a lot and would like to share an interesting topic that I learnt about in my third year.

This topic was called forensic biology, and in it we were introduced to a technique called Low Copy Number (LCN). Now, LCN is a form of DNA profiling, however it is not the one most commonly used. DNA itself is split into two categories: coding DNA and non-coding DNA. Coding DNA is where the information for our genes is kept, the stuff that makes our bodies function correctly, whereas non-coding DNA is made up of information that, at this point, does not seem to have a function.

double helix cupcake

Image from juni xu

Here’s the kicker though: because coding DNA is highly important to the body’s function, this DNA is very similar from person to person. If mutations were to occur here they could affect the body’s functionality in a very bad way. Non-coding DNA, however, doesn’t have to worry about this; it doesn’t affect the body’s functions, so there would be very few adverse effects from mutations in this area. This being the case, non-coding DNA can differ greatly from person to person, and here’s where DNA profiling comes in.

In the non-coding DNA region there are spots common to everyone where a sequence of 1-6 DNA base pairs is repeated multiple times, known as short tandem repeats (STRs). Because the number of repeats at these spots is highly variable between people, it is possible to compare two DNA profiles to see whether they have the same STRs at the known markers.
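As a toy illustration of that comparison (this is nothing like real casework software), a profile can be thought of as a table mapping each marker to a pair of repeat counts, one allele inherited from each parent. The marker names below are real loci used in forensic profiling, but all the numbers are invented:

```python
# Each profile maps a marker name to a pair of STR repeat counts.
crime_scene = {"D3S1358": (15, 17), "TH01": (6, 9), "FGA": (21, 24)}
suspect     = {"D3S1358": (15, 17), "TH01": (6, 9), "FGA": (21, 24)}
other       = {"D3S1358": (14, 16), "TH01": (6, 9), "FGA": (20, 24)}

def profiles_match(a, b):
    """True only if every marker has the same allele pair in both profiles."""
    return all(sorted(a[m]) == sorted(b[m]) for m in a)

print(profiles_match(crime_scene, suspect))  # every marker agrees: True
print(profiles_match(crime_scene, other))    # mismatched alleles: False
```

Real profiling uses many more markers than this, which is what makes a full match so statistically powerful.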

In order to look at these DNA profiles properly, the STR signals are amplified by a process called the polymerase chain reaction (PCR). With this method a sample of the DNA is multiplied many times, using enzymes and cycles of heating and cooling to replicate the DNA. Below is a YouTube video explaining the PCR process.

Interesting stuff, but what’s all this got to do with LCN? Well, in standard DNA profiling about 28 cycles of PCR are run in order to amplify the signal enough to get a characteristic STR profile. Sometimes, however, there may not be enough DNA physically present, such as at crime scenes, and so the profile that is created is incomplete or unusable.

LCN is a way of making the procedure more sensitive by increasing the number of PCR cycles to 34 when there is less DNA available, so that a better profile can be obtained. I came across an interesting article where LCN was used in four different cases to help identify remains that were about 40 to 60 years old or even older (unfortunately the article is not open access but if you are interested in reading it please contact me).
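The jump from 28 to 34 cycles sounds small, but because each cycle roughly doubles the DNA (in the ideal case), six extra cycles mean a 2^6 = 64-fold boost in amplification. A quick back-of-the-envelope sketch:

```python
def fold_amplification(cycles, efficiency=1.0):
    """Ideal fold-amplification after a number of PCR cycles.

    efficiency=1.0 means perfect doubling every cycle; real reactions
    fall somewhat short of this.
    """
    return (1 + efficiency) ** cycles

standard = fold_amplification(28)
lcn = fold_amplification(34)

print(f"28 cycles: ~{standard:.1e}-fold amplification")
print(f"34 cycles: ~{lcn:.1e}-fold amplification")
print(f"extra sensitivity from 6 more cycles: {lcn / standard:.0f}x")
```

That exponential growth is exactly why LCN can squeeze a profile out of a handful of starting molecules, and also why any stray contaminating DNA gets amplified right along with it.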

It sounds like a good idea; if you can get a more sensitive technique, why isn’t everyone using it? Well, it turns out there are some limitations and concerns with LCN, including the reproducibility of the results when there was only a small amount of DNA to begin with. In a case in England (warning, the case details contain adult material) there was an appeal to dismiss the DNA profile obtained through LCN, as the appellant argued there was not enough of the accused’s DNA to be used according to previous cases. Expert witnesses discussed that, while the quantity of the evidence was relevant, it was the reliability of the results and their interpretation that mattered more.

DNA lab

Image from snre

This has more to do with the type of profile that is obtained from LCN, as the peaks in the DNA profile may be influenced by artefacts (such as allele drop-out, stutter, noise and contamination) due to the small amount of DNA used. With practice and experience it becomes easier to identify whether the peaks have been influenced by these errors, but using the evidence in court still raises issues, especially where life sentences are concerned.

I think there’s no right or wrong answer for using LCN in cases, there are both merits and issues with the technique but if further research is done in the area it would be interesting to see it used more frequently.

Time for the bass to drop, I mean pitch

So the other day something amazing happened: the pitch dropped, and it was caught on video! What does this mean? Well, at first I had no idea, except that a lot of people on the internet were making a big deal about it, so I thought I’d check it out and find out what all the fuss was about.

Firstly, what is pitch? I’m not talking about musical pitch, where to drop it would be to lower the note you sing. What I’m talking about is the sticky, tar-looking stuff used for bitumen or asphalt. It can be made from petroleum products or plants and is an extremely viscous substance; think of an extremely thick honey.

Bitumen glue

Image from markhillary

So that’s what pitch is, not very exciting in my books, so what’s everyone talking about? Well, it turns out that in 1927 an experiment was set up, here in Australia, by Professor Thomas Parnell at the University of Queensland to show students the fluidity and viscosity (how thick a liquid is) of pitch: that even though it looks like a solid at room temperature, it is actually an extremely viscous liquid (many, many billions of times more viscous than water).

Parnell warmed the pitch up, poured it into a glass funnel that was sealed off at the bottom, and then waited three years for the pitch to settle out evenly before cutting the sealed stem of the funnel. The experiment takes into account the physics of how liquids flow, as well as the effect of temperature on the pitch itself, to calculate the pitch’s viscosity. Under gravity, the extremely viscous material slowly drips out of the funnel, and the date each drop falls is recorded, as this is on the scale of decades (the last drop fell back in November 2000).
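For a sense of the timescale, here’s a quick calculation using the publicly reported years in which the UQ drops fell (months omitted for simplicity):

```python
# Years in which a drop fell in the UQ pitch drop experiment, as publicly
# reported (the funnel's stem was cut in 1930).
drop_years = [1938, 1947, 1954, 1962, 1970, 1979, 1988, 2000]

intervals = [b - a for a, b in zip(drop_years, drop_years[1:])]
mean_interval = sum(intervals) / len(intervals)

print(f"intervals between drops (years): {intervals}")
print(f"average wait per drop: ~{mean_interval:.1f} years")
```

Roughly a decade per drop, which is why catching one on camera is such an event.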

There is a live feed of the UQ experiment to catch the next drop getting ready to fall, but still, unless you’re a physics lecturer or enthusiast this isn’t very invigorating stuff. Where it does get unique, though, is that no one had ever captured a drop of pitch falling on camera, until now! (Unfortunately, due to technical issues, they could not film the last one.)

A similar experiment was set up in 1944 at Trinity College in Dublin and on July 11th they captured one of their drops on video.

This pitch-drop experiment is the world’s longest continuously running experiment, and it won Parnell and Professor John Mainstone, the long-time custodian of the experiment, an Ig Nobel Prize. Ig Nobel Prizes are a slight parody of the Nobel Prizes, best described by creator Marc Abrahams as “honouring achievements that make people laugh, then make them think.”

Fun fact – the term ‘pitch black’ actually comes from this type of pitch, due to its extremely dark colour.

One downside to the UQ experiment, however, is that the experimental conditions were not controlled at the start (the set-up now sits in a display cabinet), so the viscosity calculations are only estimates, as the flow rate changes with the seasonal temperature. But still, it must be admitted, you’d need a lot of patience for an experiment this long!

Public speaking, or, I’d rather die first

The other day, as part of the communicating science topic I am undertaking at Adelaide Uni, I had to give a two-minute talk reviewing a journal article of my choice. Now, like most people, I absolutely HATE getting up in front of people and having to give a presentation. However, I do realise it is a necessary evil, especially for scientists who wish to share their research with others.

Public speaking
Image from brainpop_uk

Now, there are many types of anxiety disorders that people suffer from every day, but what I am referring to here is the nerves we get when we have to stand up in front of a group of people to give a presentation (similar to stage fright). We even discussed ways to try to counteract the nerves in one of our workshops. The one that stuck with me the most was remembering that what is actually happening is a scientific process.

In this post I’m going to be talking about this scientific process, what’s actually happening in your body that might make public speaking feel so terrible. To quote Heather Bray, one of our lecturers for this topic, on public speaking, “Your body thinks it’s going to die.” It’s literally true, you get up to start your talk and you’re afraid so your body sends you into fight or flight mode.

This fight-or-flight response causes the brain to signal the release of adrenaline, a chemical messenger that tells other parts of the body what to do. We’ll start with the heart: adrenaline causes your heart rate to increase, pumping more blood around the body and raising your blood pressure too, because hey, something bad is about to happen and your body wants to be fully prepared for it!

kitten hug
Presentations are stressful stuff; here’s a comforting image from rodrigotrovao of a kitten to help

The lungs also want in on the action: your body is going to need oxygen in case of fight! or flight!, so your breathing rate increases and the airways between your nose/mouth and lungs open up to let more air in. Your body will also need fuel to deal with this, so the adrenaline tells your pancreas to stop releasing insulin. When insulin levels are low, your body releases more glucose from stores in the muscles and liver, which is then used as fuel.

In order to make more fuel, the body also starts to break down the triglycerides and lipids in its fat stores, while the adrenaline also affects muscle contraction. Ever needed to suddenly go to the bathroom right before a presentation? It’s likely because the muscles in your bladder wall that control the storing and release of urine are contracting, telling your body it’s a go!

With all these things going on (as well as extra secretion from the sweat glands), it’s not surprising that you might be in a little turmoil when giving your talk. Lots of people have come up with different ways to combat the nervous effects they feel; unfortunately they don’t always work for everyone, but to finish off I’ll share one with you that I found to be a little bit entertaining as well as useful.

So remember, if you’ve got a talk coming up do the penguin.

Fish oil: Hook, line and sinker

Fish oil capsules (tran, trankapsler)

Image from jcoterhals

Following on from my last post, I thought I’d talk about a research project I did as part of an undergraduate topic. It was about, of all things, fish oil. Now, I don’t find fish oil to be a very interesting topic, at least I didn’t until I did this research project.

First of all, what is fish oil? What’s it all about? Apart from getting it straight from the source (i.e. eating fish), you can find fish oil tablets in supermarkets and pharmacies to take as supplements. Fish oil is rich in omega-3 fatty acids. We’ve been told that omega-3 fatty acids, or omega-3s as they’re more commonly known, are great to consume for their health benefits.


Image from e-wander

Without getting too much into the chemistry of what a fatty acid is (there is a bit of chemistry jargon but please bear with me), it’s a long chain of carbons and hydrogens that looks something like this (but usually a lot more carbons in the chain):

fatty acid chain

(An awful looking example of a fatty acid by me)

Or to make it simpler to visualise (where most of the letters have been removed but the structure is exactly the same):

fatty acid chain simple

(A much nicer looking example of a fatty acid by me)

The double lines indicate a double bond, and different fatty acids can have one double bond, multiple double bonds, or no double bonds at all. When you look at the back of a food package to see the fat content, it usually states the saturated fat content as well as the polyunsaturated and monounsaturated fat content.

fa content edit

Image from me

What they are literally talking about is whether the fatty acids have no double bonds (saturated), many double bonds (polyunsaturated) or only one double bond (monounsaturated). In general we’re told that saturated fats are the ones to cut down on in your diet, and I’ll explain why.

If there were no double bonds in the fatty acid it would look like a long straight line instead of the bendy chain above. This means that it would be easier for lots of those fatty acids to stack up on top of each other like this:

saturated fas

Image of stacked up saturated acids by me

This is what we don’t want to happen, because fats stacked like this won’t want to move away from each other. However, if you put even one double bond in the mix, giving a bent monounsaturated fat, they don’t sit as well on top of each other and can be broken apart and broken down in the body more easily. Add more double bonds, like in the first example, for polyunsaturated fats, and again it’s harder for the fats to stack up.
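The classification itself is just a matter of counting double bonds. A toy sketch, where each chain is written as a simplified string of carbons with ‘=’ marking a double bond (these strings are rough sketches, not proper chemical notation):

```python
def classify(chain):
    """Classify a fatty acid sketch by counting its C=C double bonds."""
    double_bonds = chain.count("=")
    if double_bonds == 0:
        return "saturated"
    if double_bonds == 1:
        return "monounsaturated"
    return "polyunsaturated"

stearic = "CCCCCCCCCCCCCCCCCC"        # 18 carbons, no double bonds
oleic   = "CCCCCCCCC=CCCCCCCCC"       # one double bond in the middle
epa     = "CC=CCC=CCC=CCC=CCC=CCCCC"  # five double bonds, EPA-like

for name, chain in [("stearic", stearic), ("oleic", oleic), ("EPA", epa)]:
    print(name, "->", classify(chain))
```

Zero, one, many: that one count is all the “saturated / mono / poly” labels on the food package are telling you.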

With all this in mind, let’s head back to fish oil, which we know contains omega-3 fatty acids. The two main fatty acids in fish oil are, with their horrendous names, eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). They are called omega-3 fatty acids because they have a, da da dun, double bond starting at the third carbon from the end (the ‘omega’ end) of the long chain!

EPA edit

EPA with the lovely double bond highlighted by me

DHA edit

DHA, again by me

As we can now see from these omega-3 fatty acids compared to the saturated fats, it would be much harder for them to stack on each other, and that’s why they are much better for us than saturated fats (as well as other reasons).

Now, right at the beginning I said I’d talk about the small research project I did, and I haven’t forgotten. The aim of the project was to look at the shelf life of fish oil supplements, as it was thought that over time the omega-3 fatty acids might suffer chemical, heat and UV light damage and degrade.


Image from here

These could potentially cause those double bonds I’ve just been telling you about to become oxidised, which, roughly speaking, means a double bond reacts and is converted to a single bond, which we of course do not want.

As we only had a short amount of time, we had to simulate shelf-life conditions by heating samples of the fish oil tablets for different amounts of time. We then used proton NMR (¹H NMR), a technique that detects the chemically distinct hydrogens in a sample, to look at whether the double bonds in the omega-3 fatty acids were changing to single bonds. If they were changing, then different hydrogens would be detected, as a single-bonded carbon in the fatty acid carries two hydrogens whereas a double-bonded carbon carries one.

NMR, 600-MHz

An NMR instrument from EMSL

We heated samples at 80°C for 5, 30, 60, 120 and 240 minutes, but when we looked at the NMR results for the 5 and 60 minute samples, there appeared to be no difference whatsoever from an untreated sample in the hydrogens detected. As the 240 minute sample yielded the same result, we heated an additional sample for 16 hours at 100°C, again with no observable difference.

fish oil

Image of fish oil supplements we used (by me)

I don’t know about you, but I don’t keep my fish oil tablets in a 100°C cupboard. So we can conclude that those awesome omega-3 fatty acids sitting in your cupboard are likely to still be good for a fairly long time, at least where temperature effects are concerned.

To finish I’d like to thank Kraft Crunchy Peanut Butter for being extremely useful for this post as well as damn tasty.

Mixing Science and Art

Today I thought I’d tell you guys a bit about the research I did for my honours year. Just a bit of background, I completed my honours in 2012 in Forensic and Analytical Chemistry, with a large focus on the analytical chemistry.

Chemistry Lab Yellow

Image from klar_rocks

Going into honours I asked myself, what’s something I really love to do? The answer was making art (or drawing and designing). Unfortunately you can’t do an honours project in science by just making art, so with the help of my supervisor we came up with something equally exciting: using an analytical technique to look at something art-y.

Ferrofluid 2

Image of ferrofluids by AMagill

Science and art have mixed for many years now. In order to improve conservation and restoration techniques, there has had to be some science and research involved. Whether it be finding out what types of materials were used in producing a sculpture or painting, or determining whether a painting is a fake, science has played an important role in how we’ve addressed those issues.

Going back to my own research, I chose an analytical technique called portable X-ray fluorescence (PXRF). This technique uses low-level X-rays to determine which elements make up the sample being tested. (I regularly got asked if it was dangerous; my reply would usually be that it’s perfectly safe as long as you don’t stick your arm in the way of the beam, which is true.)

vermilion pxrf spectrum
Image of example PXRF Spectrum from me

My own bias aside, PXRF is a useful technique for a range of reasons, including (as the name suggests) its portability: it runs on batteries as well as a power cable, so you can take it to whatever you want to analyse instead of having to bring a sample back to the lab. It’s also relatively cheap to run, and you can get sample data in a matter of minutes.

The best advantages, however, are that you can test a sample as it is, so you don’t need to prepare it in any way, and that it’s a non-destructive technique. This means it doesn’t harm or alter your sample, which makes it ideal for looking at priceless paintings or artifacts.

bruker tracer iii-v+
Image of a Bruker Tracer III-V+ PXRF instrument from me

PXRF does have its limitations, though, as it can’t detect elements lighter than magnesium on the periodic table (this can vary slightly between instruments), which leaves out a lot of organic components that might be important. In terms of looking at paints and paintings, those organic components could be the binders used in the paints or some of the newer synthetic pigments that have been developed.

Since this technique seemed quite good for looking at paint, I decided that would be the way to structure my project; however, I would need to bring something unique to it. Yes, PXRF analysis of paint had been done quite a lot before, but only qualitatively. Therefore the aim of my project was to investigate the quantitative analysis of paint using PXRF.

Image of different powdered pigments from Nathan Stang Photography

To do this I focused on pigments that were commonly used in the Renaissance era, such as azurite, vermilion and orpiment. In addition to these I used titania, as it is a very cheap and common pigment to work with, whereas the others were a little more expensive since they are much less commonly used for painting now.
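Those pigment choices map neatly onto elements PXRF sees well: copper in azurite, mercury in vermilion, arsenic in orpiment and titanium in titania. A toy sketch of the identification step (the emission-line energies are standard published values in keV, but the matching logic and the peak list are just for illustration):

```python
# Characteristic X-ray emission lines (keV) for the elements behind each
# pigment. Energies are standard published values; everything else is a toy.
LINES = {
    "Ti Ka": (4.51, "titania"),
    "Cu Ka": (8.05, "azurite"),
    "Hg La": (9.99, "vermilion"),
    "As Ka": (10.54, "orpiment"),
}

def identify(peaks_kev, tolerance=0.1):
    """Match spectrum peak positions against the lookup table."""
    hits = []
    for peak in peaks_kev:
        for line, (energy, pigment) in LINES.items():
            if abs(peak - energy) <= tolerance:
                hits.append((line, pigment))
    return hits

# A hypothetical spectrum: the 2.31 keV peak (e.g. sulfur) isn't in our
# small table, so only the mercury line is identified.
print(identify([2.31, 9.98]))
```

Real PXRF software fits whole spectra rather than matching single peak positions, but peak energy is still how it decides which elements are present.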

I learnt how to make paint from some lovely people at Artlab Australia, using my chosen pigments and linseed oil, a binder commonly used in the Renaissance era. (Fun fact: one of the reasons it’s not used very much anymore is that linseed oil rags can catch fire if not disposed of properly. I can safely say I never experienced this myself, but I did take some extensive safety precautions to avoid it.)

IMG_3484 PAHG001
Image of vermilion paint film by me

Once I made the paint I made many, many, MANY samples using different ratios of pigment to binder, different types of canvas and different numbers of paint layers. I also looked at the pigment powder by itself by pressing the powder into a disc.

azurite disc orpiment disc

Images of orpiment and azurite pressed discs by me

When I got to the quantitative analysis part, though, I soon discovered that with the software that came with our PXRF instrument it was almost impossible to do for my paints, as we just didn’t have the right standards. Working with what I had, I then looked at doing a semi-quantitative analysis; however, with the time constraints I just wasn’t able to obtain the best results.
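For context, the simplest form of quantitative XRF is a calibration curve built from standards of known composition: measure peak intensity against concentration, fit a line, then invert it for the unknown. Without matched standards there is nothing to fit against, which was exactly the wall I hit. A sketch with invented numbers:

```python
# Calibration standards: pigment concentration (%) vs measured peak counts.
# All numbers here are invented for illustration.
known_conc = [10, 20, 40, 60]
known_intensity = [105, 210, 400, 610]

# Least-squares fit of intensity = slope * concentration (line through origin)
slope = sum(c * i for c, i in zip(known_conc, known_intensity)) / sum(
    c * c for c in known_conc
)

# Invert the calibration for an unknown sample's measured intensity
unknown_intensity = 300
estimated_conc = unknown_intensity / slope
print(f"estimated pigment concentration: {estimated_conc:.1f}%")
```

Real paint is messier than a straight line (matrix effects, layer thickness and so on), which is part of why proper standards matter so much.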

I was able to do a very small investigation into analysing a sample I made with different types of paint layers and determining the thicknesses of those layers (it involved a lot of formulas and maths which I won’t bore you with, but if you are interested just ask); however, again, I could only do a qualitative and semi-quantitative analysis of the data.

layered paint

Image of paint sample with multiple layers by me

After everything was done and dusted I felt like I’d learnt more about what NOT to do when it came to research, which from what I’ve heard is one of the big parts of the honours year. It appears to be part of the process of science and research, something might sound really great and wonderful but simply not work under certain conditions, and it’s how you deal with that that pushes you to go further and keep learning.

I’ll end this post with a memento from my honours experience, as I was lucky enough to get a picture of my desk at the beginning of the year before everything started.

before-after honours