Firstly, excuse my use of UK English, but I was born, raised, and still reside in the UK!
Secondly, this is an incredibly basic question and I'm just checking that I got it right!
Now here's my question:
If my upload speed has been measured as 1 Mbps and I send an email attachment which is 8 MB in size, I have calculated that each MB of data should take 8 seconds to upload, so the whole thing should take ~1m 4s.
Okay, I am labouring under the impression here that the Mbps in the upload speed refers to megabits per second, while the MB in the file size refers to megabytes.
I am assuming that in this context 8 bits = 1 byte.
So 1 Mbps = 1 MB every 8 seconds, or indeed 7.5 MB per minute.
Is that right?
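Just to show my working, here's a quick sketch of the arithmetic in Python (the figures are the ones above; the steady 1 Mbps link, 8 bits per byte, and zero protocol overhead are simplifying assumptions on my part):

```python
# Sanity check of the upload-time arithmetic.
# Assumptions: a constant 1 Mbps uplink, 8 bits per byte, no overhead.

upload_speed_mbps = 1   # megabits per second (measured upload speed)
file_size_mb = 8        # megabytes (size of the attachment)

file_size_megabits = file_size_mb * 8                 # 64 megabits
seconds = file_size_megabits / upload_speed_mbps      # 64 seconds

print(f"Expected upload time: {seconds:.0f} s (~{seconds / 60:.1f} min)")
print(f"Throughput: {upload_speed_mbps / 8 * 60} MB per minute")  # 7.5 MB/min
```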
The only reason I'm asking is that the upload took a little longer than that, and I just wanted to check that my calculations are right and that it was just bandwidth fluctuation rather than me not understanding the basic principles of file transfer speeds!