About tamelarose

PhD student in astrophysics at the University of Cambridge. An Oregonian at heart!

Jets from black holes: a violation of physics?

Here’s a commonly asked question when it comes to learning about active galaxies:

How can jets launch from black holes when not even light itself can escape?

Credit: NASA

The short answer is that the issue arises not from physics but from semantics. It’s very easy for astronomers to say that the jets launch from the central black hole, but what they really mean is that the jets are launched from a region very near the black hole. Since the jet is powered by accretion onto the black hole, and since, on the scale of the whole galaxy, the black hole and its surrounding accretion disk are essentially pointlike, it’s easy to simply say the jets start at the black hole. But don’t worry: no material flows out from within the black hole’s event horizon, so physics is not violated!

But the bigger question touches on some very active research areas at the moment:

What causes these jets to form?

This is very much still being debated within the astronomical community, a debate made difficult because these jet-launching regions are so small and so far away that we simply don’t have telescopes powerful enough to resolve them. So we can’t directly see what’s going on. Instead we try to work things out from physical models and computer simulations. But that’s complicated too, because there’s a lot of turbulence, strong magnetic fields, and energetic fluid swirling around on very small scales, and all of that gets messy very quickly.

So, more work still needs to be done, both observationally and theoretically.

But there is a general picture that is emerging:

A whirlpool of material surrounds and slowly feeds the black hole. This is called the accretion disk.


Credit: NASA/Dana Berry, SkyWorks Digital

Accretion disks can have strong magnetic fields, which you can imagine being twisted into a helix along the rotation axis as the disk rotates.

Credit: NRAO and the Space Telescope Science Institute

We think this helical magnetic field can sweep up some of the charged material from the disk before it falls into the black hole, and propel it away from the centre in the form of a jet at near the speed of light! If you want, picture a very strong firehose blasting through space. But this analogy doesn’t do justice to these extraordinary machines. The jets are simply spectacular in their sheer power and size.
The following image shows a very nearby radio galaxy, Centaurus A, superimposed on a foreground of Australian radio telescopes and scaled to the correct angular size. So if we could suddenly see in radio light, this galaxy and its jets would dominate the sky!

Credit: NASA/APOD and Ilana Feain, Tim Cornwell & Ron Ekers (CSIRO/ATNF)

Massive black holes aren’t the only places where we find powerful jets, and I think this fact is key to understanding how the jets form.

We see jets from newly-forming stars (called protostars), jets from binary star systems, jets from pulsars, jets from gamma-ray burst events…

The Vela Pulsar Jet. A Chandra movie of the Vela pulsar shows it may be “precessing,” or wobbling as it spins. Credit: Chandra X-ray Observatory

All on different scales, but all exhibiting the same sort of thin, powerful jet. Chances are, then, that the physics is the same in all cases: some universal engine is triggered when heavy accretion occurs. The key is to compare jets on all these scales and look for the common physics that works in every case. It’s very much a work in progress, including in my own research!

As a final note, I and several other astrophysicists discussed this question, and further details about accretion disks, in a recent Naked Scientists podcast: http://www.thenakedscientists.com/HTML/podcasts/astronomy/show/20130525/


Ancient Coastlines and Modern Politics

Image from New York Times

Just a quick post to draw attention to a very interesting article by Deep-Sea News that relates the location of a Cretaceous-era coastline to current political leanings in the Deep South. Article here: ‘How presidential elections are impacted by a 100 million year old coastline’.

There’s a belt of counties through South Carolina, Georgia, and Alabama that has consistently voted Democratic in recent history, and the article plausibly links this pattern to the soil enrichment that a 100-million-year-old coastline provided. Pretty fascinating!

Organic Fashion

The BBC just published quite an intriguing article about new technologies being integrated into the fashion world:

http://www.bbc.com/news/technology-17551859

A bomber jacket made from a biological ‘leather’. Courtesy of the BBC article and Biocouture.co.uk

While many people are looking at integrating digital technologies into fabrics and clothing designs, I found the highlight on grow-able fabrics even more interesting.

Suzanne Lee is a fashion designer in London who is investigating methods by which a biological ‘skin’ from a kombucha-like process can be dried and moulded into fabrics and 3D clothing. Since the material is completely biological, it can simply be composted after a few years of use, with minimal total environmental impact.

While the technique is still in the exploratory phase, I think it’s a pretty cool idea! The leathery material that results looks unique and edgy when sewn into jackets, and the 3D moulding process could be great for making innovative shoes, hats, or other structural fashion pieces.

Depending on the material’s strength, I imagine it could also be used effectively for umbrellas, marquees, etc., where improving the compostability of these common and often disposable objects could make a big impact.

However, Suzanne does note that she can’t yet make a water-resistant version, so I guess the umbrella is out for now!

Here’s Suzanne’s TED talk: http://www.dailymotion.com/video/xljapx_ted-talk-suzanne-lee-grow-your-own-clothes_lifestyle and webpage: http://www.biocouture.co.uk/ for more info.


Venus Transit… from cloudy Cambridge

The recent transit of Venus across the Sun marked the last chance any of us will have to witness such a rare crossing event. Very sadly, I was only able to watch it virtually, via the NASA live webcast and the Astronomy Picture of the Day site, which refreshed its solar image every 15 minutes during the transit, courtesy of data from the Solar Dynamics Observatory. Incidentally, the APOD site today published a remarkably detailed image of the transit: http://apod.nasa.gov/apod/ap120607.html.

In Cambridge, we’ve been experiencing distinctly un-June-like weather for the past week and a half, and at 4am, when the UK had the chance to catch the tail end of the transit as the sun rose, we continued to have cloud.

However, clouds apparently do not mean all is lost. One of my favorite photographs to emerge from the 5th/6th of June was actually taken not far away, in Oxford.

http://andrewsteele.co.uk/photography/501715/?tag=tov2012

I think the heavy clouds in the photo add a great deal of drama to an already magical event. Kudos to Dr. Andrew Steele for the picture!

Processing HTML using Python

The other day I found myself with a list of several hundred galaxies morphologically classified as either compact or uncertain in a 2010 paper by Gendre et al. The flux contours of these galaxies are particularly small, and the objects are often unresolved by the FIRST VLA survey. But I wanted to get an upper limit on their physical size by querying the FIRST catalog for their calculated deconvolved major axis diameters. This arcsec measurement can be converted to a projected linear size in kpc if the redshift is known.

Of course, I didn’t want to type in the position coordinates for each source individually! So I wrote a python script that accesses the FIRST catalog search webpage, fills in the RA and DEC for each of my objects (provided in a text file list), and returns the major axis diameter of the closest match. In the process I had to learn about parsing HTML pages using a python module called mechanize and the BeautifulSoup package. This sort of process could be extended to all sorts of work and personal tasks that involve pulling information from online content. I found the introduction by Weekend Codes particularly helpful in getting started.

So this is the .dat file I had to work with. In total it contained 322 objects. The first six columns, containing the right ascension and declination coordinates for each object, are the only data I use to query the FIRST search form. The rest of the information just helps me keep track of each object’s name, type, redshift, flux, etc.


So one by one I wanted my script to take the position information from the list, fill in the online form and return the result. If done manually, this is the returned page from one queried position coordinate:

Here I’ve selected a search radius of 15 arcsec, and for this particular object the FIRST catalog returns two matches. I’ve also selected the text output format, which opens a new window containing just text. I’m interested in the deconvolved major axis of the closest match (so in this case, a 10.45 arcsec diameter at a search distance of 0.5 arcsec). This value gives a good first approximation to the true source size, and comes from fitting an elliptical Gaussian to the object after removing the beam information from the map (a process known as deconvolution).
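To make the closest-match step concrete, here’s a minimal, standalone Python 3 sketch of picking the nearest match from text output like this. The two rows below are invented for illustration; only the layout matters, with the search distance in the first column and the deconvolved major axis in the twelfth:

```python
# Hypothetical whitespace-separated catalog rows (header already stripped),
# mimicking the two-match case described above. These values are made up.
rows = [
    "3.20 12 00 00.0 +30 00 05.0 1.2 2.5 2.6 0.15 4.10 1.90 30.0",
    "0.50 12 00 00.1 +30 00 00.0 1.5 45.0 46.1 0.15 10.45 3.10 12.0",
]

matches = []
for row in rows:
    cols = row.split()
    dist = float(cols[0])    # search distance, arcsec
    majax = float(cols[11])  # deconvolved major axis, arcsec
    matches.append((dist, majax))

# Sort numerically by search distance and keep the nearest match
matches.sort(key=lambda m: m[0])
closest_dist, closest_majax = matches[0]
print(closest_dist, closest_majax)  # -> 0.5 10.45
```

Sorting on the parsed float (rather than the raw string) matters here, since string comparison would order "10.0" before "3.2".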

So let’s create a script to automate this process!

First I downloaded and installed the mechanize module (to access webpages and fill in the online forms) and BeautifulSoup (to ease the parsing of the HTML). See my tutorial on installing modules and packages here.

Now we’re ready to go. Begin a new python script (e.g. name.py) with:

import sys
import string
import mechanize
from BeautifulSoup import BeautifulSoup

And define the input and output files:

sourcefile='input_file_name.dat'
outfile='output_file_name.dat'

This next part is copied from the Weekend Codes tutorial linked above. Essentially, we invoke mechanize to emulate a browser.

# Browser
br = mechanize.Browser()

# Browser options
br.set_handle_equiv(True)
br.set_handle_gzip(True)
br.set_handle_redirect(True)
br.set_handle_referer(True)
br.set_handle_robots(False)

# Follow refresh 0 but don't hang on refresh > 0
br.set_handle_refresh(mechanize._http.HTTPRefreshProcessor(), max_time=1)

# User-Agent
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.1) Gecko/2008071615 Fedora/3.0.1-1.fc9 Firefox/3.0.1')]

Now, the output is going to be another text file with a list of objects and their returned major axis sizes, among other things, so let’s initialize that file with a header containing the titles of each column:

fileout = open(outfile, 'w')
fileout.write('Name RA_DEC Search_Dist/arcsec Majaxis_Size/arcsec \n')
fileout.close()

And now we read in the contents of the input file over a while loop and begin querying the webpage:

file = open(sourcefile, 'r')
while 1:
  line=file.readline()
  if not line: break
  #skip the header line containing the string 'Name'
  if 'Name' in line:
    continue
  items=string.split(line)

  RA_DEC=items[0]+' '+items[1]+' '+items[2]+' '+items[3]+' '+items[4]+' '+items[5]
  ObjName=items[6]

  # Open the site we want to query
  br.open('http://sundog.stsci.edu/cgi-bin/searchfirst')

  # Select the first (index zero) form
  br.select_form(nr=0)

  # Fill in form. Note that this requires knowledge of what the forms are called.
  # See Weekend Codes for help with this.
  br.form['RA'] = RA_DEC
  br.form['Radius'] = '15' #a 15 arcsec search radius
  br.form['Text']= ['1'] #outputs in HTML (0) or Text (1)

  # Submit query
  br.submit()

  #read in the returned webpage and parse into a 'soup' using BeautifulSoup:
  html = br.response().read()
  soup = BeautifulSoup(html)
  #split the soup into strings separated by line breaks
  txtsoup=str(soup).split('\n')

  #read string lines into a list 'data'
  data=[]
  for i in txtsoup:
    data.append(i)
  del data[0:14] #remove the header lines
  del data[-1] #remove empty last element

  #define new empty lists to which we'll append search distances and major axis sizes
  dist=[] #list of search distances, in arcsec.
  majax=[] #list of returned deconvolved major axis diameters, in arcsec.
  for n in range(len(data)):
    #split each line from data by single space ' ', which is default for split()
    dataline=data[n].split()
    dist.append(dataline[0])
    majax.append(dataline[11])
  tuplearray=zip(dist, majax)
  #convert tuples into a list so we can reference by index
  listarray=list(tuplearray)
  #sort listarray numerically by shortest search distance, since matches
  #aren't always in order on the returned webpage. Note that sorted()
  #returns a new list rather than sorting in place, so keep its result
  listarray=sorted(listarray, key=lambda x: float(x[0]))

  #create joined string separated by single space to print to outfile
  nextline=string.join([ObjName,RA_DEC,listarray[0][0],listarray[0][1],'\n'],' ')

  #open output file in append mode and write to file
  fileout = open(outfile, 'a')
  fileout.write(nextline)
  fileout.close()

The end result is an output .dat file (or .txt file, if you prefer) that contains the name, position, closest search distance, and corresponding deconvolved major axis diameter. I then feed this file into other scripts to convert each arcsec diameter into kpc using the known redshift, and plot the results on a radio power vs. projected linear size graph.
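As a rough illustration of that arcsec-to-kpc conversion, here’s a minimal, self-contained Python 3 sketch using the small-angle approximation and a numerically integrated angular diameter distance. The cosmological parameters below (flat ΛCDM with H0 = 70 km/s/Mpc and Ωm = 0.3) are illustrative, not necessarily the values used in my actual analysis:

```python
import math

# Illustrative flat LambdaCDM parameters (assumptions, not my actual ones)
H0 = 70.0        # Hubble constant, km/s/Mpc
OMEGA_M = 0.3    # matter density parameter
C = 299792.458   # speed of light, km/s

def angular_diameter_distance(z, steps=10000):
    """Angular diameter distance in Mpc for a flat universe, from a
    trapezoidal integration of the comoving distance integral."""
    dz = z / steps
    integral = 0.0
    for i in range(steps + 1):
        zi = i * dz
        e = math.sqrt(OMEGA_M * (1 + zi)**3 + (1 - OMEGA_M))
        weight = 0.5 if i in (0, steps) else 1.0
        integral += weight / e
    comoving = (C / H0) * integral * dz  # comoving distance, Mpc
    return comoving / (1 + z)

def arcsec_to_kpc(theta_arcsec, z):
    """Projected linear size in kpc for an angular size in arcsec."""
    theta_rad = theta_arcsec * math.pi / (180.0 * 3600.0)
    return theta_rad * angular_diameter_distance(z) * 1000.0  # Mpc -> kpc

# e.g. the 10.45 arcsec diameter above, for a hypothetical source at z = 0.5:
print(round(arcsec_to_kpc(10.45, 0.5), 1))  # roughly 64 kpc
```

In practice a cosmology library (e.g. astropy’s cosmology subpackage) would handle the distance calculation, but the hand-rolled integral keeps the sketch dependency-free.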

This code could probably be made more efficient and cleaner, but it does the job and makes my work a whole lot quicker. The same principles could be used to query just about any webpage form, including online email access and the like.