1. A computer monitor is running at a resolution of 1600x1200 with 24-bit color.

The number of pixels = 1600 x 1200 = 1,920,000.

One pixel's color is a string of 24 binary digits, so the number of colors one pixel can display is 2^24 = 16,777,216.

Now think of each pixel as a digit in a massive number, and think of the colors as the base (like our own digits use base 10).

So the number of different images this monitor can display is:

base ^ number of digits = 16,777,216 ^ 1,920,000

I would like someone to either calculate that answer for me, or tell me how many digits the answer is.
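For anyone who wants to check, the digit count follows from a logarithm; here is a quick Python sketch (the variable names are just for illustration):

```python
import math

# Number of distinct images on a 1600x1200, 24-bit display:
# (2**24) ** (1600 * 1200) = 2**46080000. The number itself is enormous,
# but its decimal digit count needs only a logarithm.
pixels = 1600 * 1200          # 1,920,000
bits_per_pixel = 24
total_bits = pixels * bits_per_pixel   # 46,080,000

digits = math.floor(total_bits * math.log10(2)) + 1
print(digits)  # 13871463 -- about 13.9 million decimal digits
```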

And for the sake of fascination: this array of images, despite being mostly random noise, does indeed contain every image your mind can fathom. That includes every frame of every movie, and every photograph taken or yet to be taken, of every person, at every age, from every angle. And don't forget there are pages just listing uncalculated digits of pi, and undiscovered equations. WOOHOO, the greatest database of all time can be rendered with the simplest algorithm.

2.

3. About 13.9 million digits. So if you start producing now, you may finish while there's still a tad of energy left in the last black hole...

It also contains every image of every part of the universe, please be aware some of the images may upset persons of a sensitive disposition...

4. I know this is an old thread but this is something I've thought about since childhood.

To make the experiment more feasible, one could limit the size and color palette. A set of every possible 16-color grayscale 500x400 image would be just as interesting an endeavor.

Another note on what it would include: Every page of text that ever was or will be produced, including every story that could ever be written and all faiths' religious texts, and technical specifications of technologies not yet invented.

There will also be a picture of me having relations with Natalie Portman. Seriously.

5. The number of 500x400 pictures with 4-bit grayscale would be 2^(4*500*400) = 2^800000, or about 10^240000, so each image would need a roughly 240,000-digit ID number if written in decimal.

Also, one very large, modern hard drive can store 3 TB, or 3*2^40 bytes. Each image needs 100,000 bytes (and there's no point in compressing them), so one hard drive can store roughly 2^25 pictures. That means you'd need 2^799975 hard drives.
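As a sanity check on that arithmetic, here is a rough Python sketch, using the same drive and image sizes assumed above:

```python
import math

# Rough check of the hard-drive arithmetic: 3 TB drives, 100,000-byte images.
drive_bytes = 3 * 2**40              # one 3 TB drive
image_bytes = (4 * 500 * 400) // 8   # a 4-bit 500x400 image = 100,000 bytes
images_per_drive = drive_bytes // image_bytes   # ~33 million, just under 2**25

total_images_log2 = 4 * 500 * 400    # the full set holds 2**800000 images
drives_log2 = total_images_log2 - math.log2(images_per_drive)
print(round(drives_log2))  # 799975 -- i.e., about 2**799975 drives
```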

Plus, you have no way of sorting them, so you'd have to wade through endless seas of snow to find anything interesting.

Even if you reduced it to just black and white (still many interesting pictures there) that'd only divide that exponent by 4.

There are a lot of interesting pictures in just an 8x8, black and white square with reflective symmetry between the left and right halves. There are just 2^32 such images, each needing only 8 bytes (assuming you store both halves for simplicity). Even that collection takes up 32 GB though.
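The symmetric 8x8 case is small enough to actually generate. A sketch of how one might index into that collection (the function name is purely illustrative):

```python
# An 8x8 black-and-white image with left/right mirror symmetry is fully
# determined by its 4x8 = 32 left-half pixels, so there are exactly 2**32
# such images. Any 32-bit index can be decoded into one of them:
def symmetric_image(index):
    """Return the image for `index` as 8 rows of 0/1 pixel lists."""
    rows = []
    for r in range(8):
        left = [(index >> (r * 4 + c)) & 1 for c in range(4)]
        rows.append(left + left[::-1])  # mirror the left half onto the right
    return rows

total_bytes = 2**32 * 8           # 8 bytes per image, both halves stored
print(total_bytes == 32 * 2**30)  # True: the whole collection is 32 GiB
```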

6. Those particular parameters were just an example of some limitation, as opposed to the original poster, who was calculating based on an entire HD color display.

Also I'm not sure why compression would be pointless but I could be missing something.

This was just a rudimentary primer on some remotely feasible version of the project. I don't have the math skills to come up with my own version of a practical figure.

But a seemingly impractical set of parameters might still be workable, thinking in broader terms. An online distributed project and an open-ended timeframe might expand the possibilities. The images could be made available online as they're generated, with increasing speed as processing and storage technology improves over the years. Maybe a folding@home type app too.

Throw in an addictive web interface with points system, user level hierarchy, fast slideshow capability, rewards for significant finds based on community vote... even if the full set of permutations is never achieved, the project could still yield some fascinating results.

7. Sorry, but no. Even with your reduced parameters, 2^799975 hard drives would mean every person on the planet would need to buy roughly 2^799942 of them. (Exponents are like that.) There's just no possible way. For further comparison, there probably aren't more than 2^300 particles in the universe, so there's not enough matter in the universe to build that many hard drives.

Compression is pointless because it only works when you can declare some possibilities uninteresting. Basically, if you want the full set, the overhead in compression will make the set bigger, not smaller.

8. MagiMaster said: "Even with your reduced parameters..."

...when Equazcion already said, "Those particular parameters were just an example..."

MagiMaster said: "Basically, if you want the full set..."

...when Equazcion already said, "even if the full set of permutations is never achieved, the project could still yield some fascinating results."

MagiMaster said: "there probably aren't more than 2^300 particles in the universe"

...when Equazcion already said, "...with increasing speed as ... storage technology improves over the years..."

To sum up, problems with my particular parameters are irrelevant, problems with generating the full set are irrelevant, and particles in the universe are irrelevant since storage technology advances allow more to be stored on less matter. Also, ordinary jpeg image compression seems doable. Open your mind and don't be a pedant.

9. The problem is not the algorithm; the problem is deciding what is important.

10. Originally Posted by equazcion
MagiMaster said: "Even with your reduced parameters..."

...when Equazcion already said, "Those particular parameters were just an example..."

MagiMaster said: "Basically, if you want the full set..."

...when Equazcion already said, "even if the full set of permutations is never achieved, the project could still yield some fascinating results."

MagiMaster said: "there probably aren't more than 2^300 particles in the universe"

...when Equazcion already said, "...with increasing speed as ... storage technology improves over the years..."

To sum up, problems with my particular parameters are irrelevant, problems with generating the full set are irrelevant, and particles in the universe are irrelevant since storage technology advances allow more to be stored on less matter. Also, ordinary jpeg image compression seems doable. Open your mind and don't be a pedant.
You can't possibly think that we could store more than 1 bit on a single particle, much less the 2^799600 or so bits we'd need to store all that, even using every particle in the universe as storage. I feel as if I'm failing to convey the vastness of this number.

If you're not generating the full set, then you have a monkeys-at-typewriters problem. There's all kinds of problems with that too.

11. Just for some perspective.
2^10=1,024
2^50=1,125,899,906,842,624
2^100=1,267,650,600,228,229,401,496,703,205,376
2^799,600=WTF!

12. Actually, there is one specialized form of compression that deals with a complete collection like this: a De Bruijn sequence. Unfortunately, this only reduces the size of the collection from about 2^800020 bits to 2^800000 bits.

I forgot, 2^800000 was the number of pictures, each needing about 2^20 bits. A De Bruijn sequence would let each picture overlap in such a way that each would only take an average of 1 bit (assuming I didn't mess my calculations up). It'd still take an impossible amount of space to store.

I was about to write out that number, but 2^800000 is about 10^240000, and 240,000 zeros would take about 2,400 lines to write on my monitor, or about 48 screenfuls of nothing but 0s. (To compare, the number of particles in the universe would take up about 1 line.)
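For the curious, here is a small Python sketch of the idea, using the standard Lyndon-word construction of a De Bruijn sequence. With tiny 4-bit "pictures" you can see how every possible pattern appears exactly once as a sliding window, i.e. one bit per picture on average:

```python
def de_bruijn(k, n):
    """Standard Lyndon-word construction of a De Bruijn sequence B(k, n)."""
    a = [0] * k * n
    sequence = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

# Toy case: all 2**4 = 16 four-bit "pictures" packed into just 16 bits.
seq = de_bruijn(2, 4)
windows = {tuple((seq + seq)[i:i + 4]) for i in range(len(seq))}
print(len(seq), len(windows))  # 16 16 -- every 4-bit pattern, once each
```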

13. Originally Posted by MagiMaster
I feel as if I'm failing to convey the vastness of this number.
You mean the number you calculated as a result of the parameters I said don't matter? No you've conveyed that fairly well.

Also consider that there was a time when the foremost experts thought phone lines could never transmit data at more than 14.4 kb/s. They'd apparently proved that to a mathematical certainty, from what I'd heard anyway. Regardless, it seems there are lots of examples of this in history, from which the pedants never seem to learn. Try not to be so quick to assume you know all the factors involved in assessing the future.

As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality. I think I heard someone smart say that once.

14. Err... No, if I had succeeded in conveying just how vast that number is, you'd have realized that there is no possible way, no matter how technology changes. Also, if you had followed along with my math, you'd have realized that reducing the parameters even further wouldn't really change that.

Let's cut it down to just a 256x256 picture in black and white. You can still fit a decent picture into that, but it's getting a bit tight. Such a picture would require 65536 or 2^16 bits to store and there are 2^(2^16) or 2^65536 such pictures. Using a De Bruijn sequence, we could store them all in 2^65536 bits. That'd still require storing something like 2^65236 bits of information on every particle in the universe. That's roughly a number with 20,000 decimal digits written on every single particle in the universe.

Well, how about a 64x64 black and white picture? That's roughly a forum avatar, though just in black and white. You can't really fit much in there, but there'd still be some interesting stuff. No, that'd still require roughly 2^3800 bits of information for every particle.

Down to an 8x8 black and white image. That'd need 2^64 bits. That's 2^20 or 1,048,576 terabytes. It's not entirely impossible that we could store that much info (there are probably well over 1,000,000 1TB drives out there, but good luck getting them together in any practical way), but you can perhaps understand why this is generally impossible for anything of any reasonable image quality.
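The scaling in the last few paragraphs is easy to tabulate; here is a quick sketch using the same assumptions as above (a De Bruijn sequence needs about 2^N bits for all N-pixel black-and-white images, and 2^300 particles as a generous upper bound for the universe):

```python
# log2 of the bits each particle in the universe would have to hold,
# for the full black-and-white image set at each side length.
PARTICLES_LOG2 = 300                  # generous upper bound, as above
per_particle = {}
for side in (256, 64, 8):
    pixels = side * side              # bits per image; ~2**pixels bits total
    per_particle[side] = pixels - PARTICLES_LOG2

print(per_particle)  # {256: 65236, 64: 3796, 8: -236}
# 256x256 -> 2**65236 bits per particle; 64x64 -> ~2**3800;
# 8x8 -> a negative exponent, i.e. the whole set fits with room to spare.
```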

15. In your last example I'm pretty sure an 8x8 pixel image set would have 2^64 possible images @ 64 bits per image, putting the total data in the neighborhood of 150 million terabytes, rather than a mere 1 million. I'm probably not as good at the math as you though so I could be wrong.

...but if I'm right it only proves your point more, and I get it.

Still though...

Math tends to tell us what's impossible until it's done, and then tells us why it was so likely they'd have made a mistake like that.

This would be an interesting project, at least in my mind. If you don't think it's worth a go, I won't force you to participate.

16. 2^64=18,446,744,073,709,551,616.
That's 4,611,686 terabytes.
I assumed 4 bits per byte. I also used the rounded figure to convert to terabytes.
I used this online calculator; http://web2.0calc.com

17. There are 8 bits per byte, but yeah, it'd be about 2,000,000 terabytes, not 1,000,000.

18. Again though, 2^64 just gives you # of possible 1-bit 8x8 pixel combinations.

8*8 == 64 pixels, each of which can be either black or white, so there are 2^64 possible combinations.

Each of those images is then 64 bits, so 2^64*64.

Again I could be getting this wrong since math isn't my thing, but if so I'd be interested to know why.

With regard to the project,
It wouldn't be out of this universe to have a program that produces one image at a time.
The user could slide a cursor along a bar, which moves from the first image to the last image. Each sequential image differs from the previous one by a single bit.
The bar will be quite long; gonna need a zoom function.
The first image will be a black or white screen, and the last image a white or black screen, respectively.

One could watch a slide show which moves through, say, 10 frames a second, skipping 1000 or a million images per frame.

Maybe skipping along the bar, at distances according to various algorithms would produce nice results.
Perhaps moving through the prime numbers will show equazcion and Natalie every which way they can

20. Hmm... I missed the earlier reply somehow.

Let's see. Yes, each 8x8 image would be 64 pixels, and one bit per pixel. Which is then 2^64 pictures at 64 (= 2^6) bits each, so to store the whole collection uncompressed would need 2^70 bits. There are 2^43 bits in 1 Terabyte. So the whole collection would take 2^(70-43) or 2^27 Terabytes, or about 134,000,000 terabytes. Not so big that it wouldn't fit in the universe, but still pretty big. (So yeah, your math is correct. You're just not converting it to terabytes.)

The form of compression I mentioned earlier would reduce each image to 1 bit (on average), leaving you with about 2,000,000 Terabytes.

Anyway, yeah, doing it procedurally would be easy. Having some kind of fractal slider would make browsing easy enough, but your slideshow (10 frames a second, skipping 1,000,000 at a time) would take about 1,800,000,000,000 seconds, or almost 60,000 years. (I could write this program in just a few minutes really, if you want to sit around and watch it for 60,000 years.)
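The generator really is trivial; a minimal sketch (the function name and ASCII rendering are just illustrative), with the slideshow timing worked out:

```python
# The i-th 8x8 black-and-white image is just the 64-bit binary form of i.
def image_at(i):
    """Render image number i (0 <= i < 2**64) as 8 rows of '#'/'.' text."""
    return ["".join("#" if (i >> (r * 8 + c)) & 1 else "." for c in range(8))
            for r in range(8)]

print("\n".join(image_at(0)))           # the first image: all blank
print("\n".join(image_at(2**64 - 1)))   # the last image: all filled

# Slideshow timing: 10 frames/second, skipping 1,000,000 images per frame.
frames = 2**64 // 1_000_000
years = frames / 10 / (60 * 60 * 24 * 365)
print(round(years))  # about 58,000 years, matching the estimate above
```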

The second problem with browsing through it is that nearly all the pictures are just snow and there's no way of knowing where the interesting pictures would be.

21. The idea is akin to having a dozen monkeys splash ink on a sheet of paper then examining the work to determine whether they have discovered the absolute mass of the W boson.

There is a reason why people who have made huge contributions in the field of science are remembered and idolized. They sifted through the clutter of information, made connections, and presented beautiful, simple models that help us quantify physical phenomena.

22. Thanks, everybody, for chewing on my insane post. I know it's infeasible, but the exercise of pondering the contents of this rendered database is endlessly enthralling. By the way: this entire forum thread is in the database. Also in the database: the solution to how to efficiently store the database on every particle in the universe. Also in the database: the solution to finding the good, juicy images and weeding out the noise images.

cheers

23. Originally Posted by pedronaut
Maybe skipping along the bar, at distances according to various algorithms would produce nice results.
Perhaps moving through the prime numbers will show equazcion and Natalie every which way they can
You're totally on to something!!! I love it. Instruct the algorithm to only keep every bazillionth image, and then you have the greatest film ever made in the universe, going from all black to all white, with relations with Natalie Portman in between.

24. Some related stuff to chew on:

Every Icon Project Page
Every Icon

And of course Borges essentially had this same idea in 1941: The Library of Babel
