First image of a black hole:
https://www.bbc.co.uk/news/science-environment-47873592
Some background:
https://youtu.be/pAoEHR4aW8I
Last edited by Franco; 10th April 2019 at 17:47.
Whilst it’s a great achievement, I was left somewhat disappointed by the whole affair.
Listening to breakfast news this morning, apparently they used eight telescopes to collect data, collated it and created a virtual photo.
So what they’re really saying is “here’s a sketch of what we think it looks like!”
Also, if the black hole is surrounded by bright light on the edges of the event horizon, why can we see a black disc in the middle? Surely all we would see is a ball of light similar to a star?
I’m no physicist, so if anyone who is would be kind enough to explain it to me in layman’s terms, I’d really appreciate it.
That can actually be explained - the matter (and hence the light it emits) is swirling around the 'black hole'. Think of it like the rings around Saturn. I believe matter at the 'ends' gets flung out straight.
Edit - see this Guardian article, a decent graphic - https://www.theguardian.com/science/...e-breakthrough
The way they built the black hole for Interstellar (from a mathematical viewpoint) gives a good explanation.
https://www.youtube.com/watch?v=MfGfZwQ_qaY
The image posted by the BBC looks like a low-res image due to the constraints of distance and our current technology.
Gotta say, I shouted "potato camera" at my screen when I saw the reveal.
I found this quite useful as well - has some 3d visualisation of the black hole.
https://www.youtube.com/watch?v=l29wCKkQpMg
That's basically how a digital camera works. Say the Event Horizon Telescope is an 8-pixel sensor; because it changes position every minute of every day as the Earth rotates, it effectively becomes a 64-pixel sensor, or much, much more. Either way, like a digital camera sensor, it measures waves - radio waves, in the Event Horizon Telescope's case - and produces a ton of data, which is then processed through an algorithm to produce a visual representation. Adding more telescopes around the world and linking them into the array would mean adding more pixels to the sensor, ultimately giving you a higher-resolution image.
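The "more readings → cleaner result" idea in that post can be sketched numerically. This is only a toy illustration with made-up numbers and plain averaging - the real Event Horizon Telescope pipeline uses interferometry and far more sophisticated imaging algorithms - but it shows why accumulating many noisy measurements beats a single one:

```python
import random
import statistics

random.seed(42)

TRUE_BRIGHTNESS = 100.0  # hypothetical source intensity (arbitrary units)
NOISE_SIGMA = 20.0       # hypothetical per-measurement noise

def observe() -> float:
    """One noisy 'telescope' reading of the source."""
    return random.gauss(TRUE_BRIGHTNESS, NOISE_SIGMA)

def combined_estimate(n: int) -> float:
    """Average n independent readings; error shrinks like 1/sqrt(n)."""
    return statistics.fmean(observe() for _ in range(n))

few = combined_estimate(8)      # a handful of readings: still noisy
many = combined_estimate(8000)  # many readings: error ~ 20/sqrt(8000) ≈ 0.22
print(abs(few - TRUE_BRIGHTNESS), abs(many - TRUE_BRIGHTNESS))
```

With thousands of readings the averaged estimate sits within a fraction of a unit of the true value, whereas a handful of readings can easily be off by several units.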
So it's more like 'here's an illustration of what it looks like based on the light measured from it', which can be said of any digital image whether it's from a radio telescope or an iPhone.
I can't visualise how it can be the same.
My chocolate Lab is sat on the bottom of the sofa and I'm looking at her. The image I capture on my iPhone is exactly the same image my eyes see.
What they said here was they correlated the data collected from the cameras and created a visualisation of how they think it is. How can that be the same?
Radio waves compared to light waves – it's a data “conversion” from one form into the other.
Given the vast distance involved, any optical telescope (which doesn’t exist) able to let us see it, would be showing us what it looked like millions and millions of years ago, not what it’s like presently.
Latest copy of the image in focus..........
In a physical sense, what you're seeing and what your phone screen displays aren't the same at all. Almost every aspect is different to some degree. What you're seeing is a full-size, three-dimensional pair of moving images that absorb and reflect light, made up of billions of colours and textures. Your phone screen shows a 2D representation of that scene - back-lit, in millions of colours, at a fraction of the size - based on the digital data your iPhone camera sensor picked up in the split second the shutter opened. Your brain interprets the two as similar.
You could have 8 different digital cameras pointed at your chocolate Lab and none of the images would be exactly the same, as each lens, sensor and processor will interpret that data slightly differently, but each will still be a picture of a chocolate Lab. And it's the same with the Event Horizon Telescope: point 8 different radio telescopes at it from 8 different locations across the planet, and although each would be reading the same underlying data, each would interpret it slightly differently, giving you slightly different yet very similar images.
Took this photo of Bigfoot on the way to work this morning, well actually not the real Bigfoot but I reckon this is what he'd look like
This is fascinating stuff although I do feel a bit down when I'm struggling with an IKEA flat pack and they're out there detecting black holes! Lol.
A picture of a black hole is something I never thought I’d see as a Physics student in the ‘80s, so I think it’s brilliant.
Einstein didn’t dream up black holes entirely on his own.
Radio waves tell you what an object looks like, but we can’t see them with the naked eye, so they have to be translated back into something we can see, which is visible light. It’s still a picture of a black hole just like an electron microscope picture or a low-light camera image.
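The "translate radio data into something we can see" step is, at its simplest, just rescaling measured intensities into display pixel values. A minimal sketch, with made-up numbers standing in for radio brightness measurements (nothing here reflects the actual EHT processing):

```python
def to_grayscale(intensities):
    """Map arbitrary measured intensities to display values 0-255."""
    lo, hi = min(intensities), max(intensities)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat field
    return [round(255 * (v - lo) / span) for v in intensities]

# hypothetical 1-D strip of radio brightness across the ring
radio = [0.02, 0.35, 0.90, 0.10, 0.75, 0.05]
pixels = to_grayscale(radio)
print(pixels)  # brightest sample maps to 255, dimmest to 0
```

The same principle applies to any instrument that records something our eyes can't see - electron microscopes and low-light cameras included - and a false-colour palette can be applied in exactly the same way.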
The pictures are sewn together (or stitched), not sown.
The ‘back hole for interstella’ sounds like a prop from a porn film.
True, that can crop up. :)
I reckon the human visual system is pretty good. Better than a typical digital camera for most human needs, e.g. binocular vision, motion sensing, colour resolution, and range of sensitivity to light (optimised to certain wavelengths/colours).
Not good for looking at a black hole though.