Computers

The wonderful devices that make the world go round.

The Great Internet Fire

I haven’t posted in a long time, so if you’ve been here before, welcome back. The recent throttling of the Santa Clara Fire Department has made this a critical issue. I feel compelled to write about it because it’s something I’ve thought about for a long time. Plus, organizing and writing out my thoughts helps me explain them better to family and friends.

Here is some context for why I’m writing this post. Santa Clara FD didn’t have an unlimited, unthrottled plan when they should have, and Verizon throttled them while they were fighting huge wildfires in California. I’m not going to comment on the dispute between Santa Clara FD and Verizon, but I do have a lot to say about throttling and data caps in general.

Throttling became a thing when ISPs stopped charging per-megabyte rates. This BS would go away if ISPs started offering rated plans again, only instead of dollars per megabyte it would be dollars per gigabyte.

T-Mobile and Verizon are tier 3 ISPs, meaning they don’t have physical connections to other geographic regions and have to rely on others to make those connections for them. Companies like Level3 and AT&T are tier 1 ISPs, with physical connections spanning the entire world. Tier 3 ISPs make paid peering agreements with upstream tier 1 ISPs, which lets them “see” the rest of the Internet. Tier 1 providers have always charged a per-gigabyte or per-terabyte rate. At some point tier 3 providers stopped charging per-megabyte rates, which I assume was because it’s easier on the consumer or for marketing reasons. Throttling wasn’t an issue back then because the physical data rates weren’t that fast.

Consumption has always grown as the pipes got fatter, and when Netflix and its kin showed up, consumption exploded. You can use over a terabyte a month on a fast cable connection. The fixed monthly rate doesn’t jibe with how people use the Internet today. I’ve heard of Netflix being used as background noise, like it’s some kind of broadcast FM whitespace: 1080p video with 5.1-channel surround sound as background noise! That’s insane to me. It’s like running the hot water all day to humidify your house.

Tier 3 ISPs make their money by building out smaller networks to individuals. They divvy up the bandwidth among users based on how much profit margin they want. The ISP loses money if an individual user eats past their allotted margin, and that loss is balanced out by other users who hardly use any bandwidth.

The ISP for my home is AT&T U-verse, with a rate cap of 1024 gigabytes. The line rate is 24 megabits per second, which is the best achievable rate for my area. I’ve used 402GB this month, and 90% of that is Steam and online video content. AT&T’s overage policy is very relaxed, as it should be from a tier 1 provider. They only charge the overage if you exceed your monthly cap three times, and the overage is $10 per 50GB up to $100 total for U-verse customers. You can get an unlimited allowance for an extra $30 a month or by purchasing DIRECTV or U-verse TV on a combined plan. [1] I could use 1500GB per month and only pay an extra $100.

Nowadays you don’t pay for raw speed; what you DO pay for is the amount of profit margin you eat into. You can get an idea of how ISPs divvy up the bandwidth by looking at the rate caps of similarly priced plans. The total price for my limited plan above, including leases, taxes, and fees, is about $70 to $80 a month. This number is a bit inaccurate because I recently changed my plan, and it also has a discount applied from an upgrade deal.

At 24 megabits per second I could download 7.884 TB (terabytes) of data in a month. Assuming I had already used up my data cap three times, I would pay $180; on the unlimited plan I would pay $110. Spread over 7.884 TB, the rated price works out to $22.83/TB and $13.95/TB for each plan respectively. Assuming I wasn’t a crazy person and only downloaded 1.5 TB, it works out to about $120.00/TB and $73.33/TB. Work the sane and limited-plan 1.5TB rate into gigabytes and it’s $0.12 per gigabyte.

In my mind, 12 cents per gigabyte doesn’t sound too bad. There are 8 days left in the billing cycle as of when I checked my usage, and I have used about 402 gigabytes so far. At that $0.12 rate, my 402GB works out to $48.24, and that includes the equipment lease plus all taxes and fees since the rate was derived from the total bill. I would be completely OK with paying a usage rate subsidized by a $25.00 fixed rate. My natural gas provider does this, and it makes sense when you take into account the peaks and lulls during winter and summer. Internet usage doesn’t swing anywhere near as much as natural gas, but the idea still fits somewhat.

ISPs have been too reliant on nobody using the bandwidth that was “promised”. Hardcore Internet users absolutely kill profit margins for tier 3 ISPs. Everyone should start paying a rated price again. Maybe then people would stop using Netflix as background noise and sucking up bandwidth like it’s air.

I purposely set YouTube to 1440p even though my monitor resolution is 1600×900. 1080p doesn’t fully account for encoding noise, so I went one step higher; downscaling from 1440p gives me the sharpest picture possible without hogging too much bandwidth. I could watch videos in 4K, but the extra resolution would just go to waste. Netflix resolution is moot for me because I don’t have the 4K plan.

Here is the math behind the $48.24 figure. All data rate calculations were done in base 10; base 2 would be the pedantic way of doing things, but base 10 is easier to understand.

402GB current monthly usage
$10.00 per 50GB overage
$100.00 U-verse maximum overage

Get the maximum “paid for” overage.
$100 / $10 = 10 overage blocks
10 * 50GB = 500GB
500GB + 1000GB (the 1024GB cap, rounded to base 10) = 1500GB

Add the overage to the approximate monthly fixed rate.
$80.00 + $100.00 = $180.00

Find the rated price per terabyte.
$180.00 / 1.5TB = $120.00/TB

Convert the price to gigabytes.
$120.00/TB / 1000 = $0.12/GB

Find the rated monthly price.
402GB * $0.12/GB = $48.24
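
If you want to double-check the numbers or plug in your own plan, here is a quick C++ sketch that reproduces the math above. The plan constants are the ones from this post, so swap in your own.

#include <cstdio>

int main() {
    // Plan numbers from this post; swap in your own.
    const double lineRateMbps   = 24.0;    // line rate in megabits per second
    const double fixedRate      = 80.00;   // approximate monthly bill
    const double overageMax     = 100.00;  // U-verse maximum overage
    const double overagePer50GB = 10.00;   // overage price per 50GB block
    const double capGB          = 1000.0;  // 1024GB cap, rounded to base 10
    const double usedGB         = 402.0;   // usage so far this month

    // Maximum possible transfer in an average month (base 10 units).
    double maxTB = lineRateMbps / 8.0 * 1e6 * 86400.0 * 30.4167 / 1e12;
    printf("Max monthly transfer: %.3f TB\n", maxTB);  // 7.884 TB

    // Maximum "paid for" allowance: the cap plus every overage block.
    double paidForTB = (capGB + overageMax / overagePer50GB * 50.0) / 1000.0;
    double ratePerTB = (fixedRate + overageMax) / paidForTB;
    printf("Rated price: $%.2f/TB ($%.2f/GB)\n", ratePerTB, ratePerTB / 1000.0);

    // What this month's usage costs at that rate.
    printf("Rated monthly price: $%.2f\n", usedGB * ratePerTB / 1000.0);
    return 0;
}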

Posted by admin in Computers, Things that Suck, 0 comments

GCC was here

GNU did that part on the end.

Kudos to Google Books Ngram Viewer – graph link

Posted by admin in Computers, Programming, Random, 0 comments

Yay!!! New Arduino RNG

Here are the results of the new random number generator I built yesterday.
The old random number generator was not putting out an unbiased stream of bits.
I tried using debiasing algorithms with the old generator but it was not enough.
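
For anyone curious what a debiasing algorithm looks like, the classic one is Von Neumann’s: read the bits in pairs, keep the first bit of every 01 or 10 pair, and discard 00 and 11 pairs. Here is a minimal sketch of the idea; it isn’t the exact code I ran, just the standard technique.

#include <cstddef>
#include <cstdint>
#include <vector>

// Von Neumann debiasing: consume input bits in pairs. A 01 pair
// emits 0, a 10 pair emits 1, and 00/11 pairs are discarded.
// This removes bias from independent bits, at the cost of
// throwing away about three quarters of them.
std::vector<uint8_t> vonNeumannDebias(const std::vector<uint8_t>& bits) {
    std::vector<uint8_t> out;
    for (std::size_t i = 0; i + 1 < bits.size(); i += 2) {
        if (bits[i] != bits[i + 1]) {
            out.push_back(bits[i]); // 01 -> 0, 10 -> 1
        }
    }
    return out;
}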

ent LOGGER00.CSV

Entropy measures how random a piece of data is: basically, how many bits of true randomness are packed into each byte.
Truly random data comes out to 8.0 bits per byte.
We are really close. Yay!!! Five Nines!!! Statistically Significant!!! Yay!!!
Entropy = 7.999993 bits per byte.
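
If you’re wondering how ent gets that number, it’s just Shannon entropy computed over the byte frequencies. A rough sketch of the calculation (this is the textbook formula, not ent’s actual source):

#include <cmath>
#include <cstddef>
#include <cstdint>

// Shannon entropy in bits per byte: H = -sum(p * log2(p)) over the
// 256 possible byte values. A perfect generator makes every byte
// value equally likely, which works out to exactly 8.0 bits per byte.
double entropyBitsPerByte(const uint8_t* data, std::size_t len) {
    double counts[256] = {0.0};
    for (std::size_t i = 0; i < len; i++) counts[data[i]] += 1.0;
    double h = 0.0;
    for (int v = 0; v < 256; v++) {
        if (counts[v] > 0.0) {
            double p = counts[v] / (double)len;
            h -= p * std::log2(p);
        }
    }
    return h;
}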

This basically says how small you could get the file with a zip program. Zip files, and pretty much all compression schemes, do their job by removing patterned and repeating sequences of bits, so a truly random file shouldn’t compress at all.
Optimum compression would reduce the size of this 25874432 byte file by 0 percent.

I couldn’t quite figure out what a good figure is on this test at first. The way the ent docs read, the percentage should sit well away from the extremes: below 1 percent or above 99 percent means the data is almost certainly not random, and values near 50 percent are ideal. So 50.50 percent, with a chi-square of 254.05, looks about as good as it gets.
Chi square distribution for 25874432 samples is 254.05, and randomly
would exceed this value 50.50 percent of the times.

Do the bytes average out to the middle? This roughly tests the ratio of ones to zeros in the bit stream: unbiased bits make the byte values average out to 127.5.
Arithmetic mean value of data bytes is 127.4920 (127.5 = random).

Can we calculate Pi correctly? If so, we pass. Yay for Pi!!! Yay for Pie!!!
Monte Carlo value for Pi is 3.141568568 (error 0.00 percent).
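
The way this test works: successive bytes are paired up as (x, y) points in a unit square, and the fraction that lands inside the inscribed quarter circle approaches Pi/4 for random data. A bare-bones sketch of the idea; if I remember right, ent builds its coordinates out of several bytes at a time, but the principle is the same.

#include <cstddef>
#include <cstdint>

// Monte Carlo Pi: treat byte pairs as (x, y) points in the unit
// square and count how many land inside the quarter circle of
// radius 1. For uniform random bytes, inside/total approaches Pi/4.
double monteCarloPi(const uint8_t* data, std::size_t len) {
    std::size_t inside = 0, total = 0;
    for (std::size_t i = 0; i + 1 < len; i += 2) {
        double x = data[i] / 255.0;
        double y = data[i + 1] / 255.0;
        if (x * x + y * y <= 1.0) inside++;
        total++;
    }
    return total ? 4.0 * (double)inside / (double)total : 0.0;
}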

I’m assuming this tests byte-to-byte similarity, i.e. how much the current byte depends on the one before it. A value near zero means each byte is independent of its neighbor.
Serial correlation coefficient is -0.000016 (totally uncorrelated = 0.0).

True_Random_Logger

Random class for Arduino

Logger class for Arduino

Schematic

The diode is a 12.1V zener and C1 is a ceramic disc capacitor. The zener diode’s part number is an artifact of the schematic designer I used, so you can ignore it.

[svg src=/wp-content/uploads/2012/05/RNG2_schem.svg width=550 height=550]


Full Size

Posted by admin in Computers, Electronics, Programming, 3 comments

Bad NGEN, Bad

A few months back I decided to try out Visual Studio 2010, which uses version 4 of the .NET Framework. So I proceeded to install everything, and after a few reboots it was done. However, my boot process was no longer showing me its race cars and rockets. No, it was more like turtles and slugs. The startup application loading process would stop halfway for about 2 minutes, and after that everything would start loading again. What was weird was that the Windows Firewall built into Windows XP, my anti-virus, and networking all would not load. Of course, after the initial 2 minutes of wondering whether I was going to have to install the OS `again`, everything proceeded normally.

So I did what I normally do when things act up, which is dig through forums, blogs, and the general Internet. Eventually I found on my own that the new version 4 of the NGEN service was to blame. For some reason it was failing to load and was holding everything else up in the process. After googling some more and turning up empty, I decided to just disable the NGEN service at startup, only starting it manually after everything else had loaded.

Recently I found some new forum posts from August that described running the command “ngen update” from the directory “C:\Windows\Microsoft.NET\Framework\v4.0.30319\”. After waiting for what seemed like forever for it to finish, I set the NGEN v4 service back to Automatic so it would start at the next boot. I rebooted and logged in, and to my amazement everything was fixed: no disabled firewall or anti-virus, and the network was immediately available. It ran just like before when the service was disabled, but without the 2-minute wait times. So I’m very happy to have this problem fixed. Just another bug and another day.
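
For anyone hitting the same slow boot, the fix from those forum posts boils down to two commands. Note that ngen update rebuilds any native images that have gone stale, so expect it to run for a while.

cd C:\Windows\Microsoft.NET\Framework\v4.0.30319
ngen update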

Posted by admin in Computers, 0 comments

HDD Performance

In recent months there has been some buzz about SSDs, otherwise known as solid state drives. Performance from some of these drives is better than their mechanical counterparts, and with access times in the microsecond range it’s no wonder they appeal to speed junkies. But this one is for the rest of us who either don’t have the cash or can’t upgrade an older machine, like that old IDE laptop you’ve got sitting around. Fortunately, there are some things you can do to give your old clunker a little more zip, and one of them is to defrag your hard drive. Before I found my current defragmentation program, I used Defraggler from Piriform. It was too slow for me, though: it would move a few blocks and then sit there and think about what to move next, and I’m not really a waiting kind of person. So I started looking for something new, and that’s when I found Auslogics Disk Defrag.

Auslogics Disk Defrag is much faster than Defraggler and does a lot more. It can defragment multiple disks at once, and it has an auto-defrag function that defragments the hard drive while the computer is idle. Now, I know this function is built into the Windows defragmenter, but with Auslogics you can specify the parameters for idleness. On my machine there is always a program running that occasionally uses more than 10% of the CPU, which is enough for Windows to consider the machine busy. That causes the Windows auto defrag to never run, so having the option lets me fix that problem. The drive map is neat but not too different from Defraggler’s. However, you can change the color theme to match another program; since I was used to my old program, I chose the Defraggler theme.

The only downside to Auslogics Disk Defrag is that it has some advertising for the company’s other products inside the program. The system health function doesn’t actually do anything that I can tell; I think it just spits out a random number to get you to download their System Cleaner program. I use CCleaner for that, which is a great program from Piriform.

So if you have an old clunker like me and are looking for ways to put some oomph back into it without dropping cash on an upgrade, Auslogics Disk Defrag is a free and fast alternative. It does wonders for my virtual machine disk files. You can download it at http://www.auslogics.com/en/software/disk-defrag/

Posted by admin in Computers, 0 comments

Adventures in Upgrading to Lucid Lynx


Just upgraded my Ubuntu partition to Lucid Lynx today. Everything installed fine, but after it was all done and I rebooted, I noticed that desktop effects in Compiz were disabled. I thought to myself: I know they worked before the upgrade. Then I started digging and came upon an entry in the Xorg log that said “Failed to initialize GLX extension (Compatible NVIDIA X driver not found)”. The NVIDIA part was the tipoff, since I have an ATI card. So I did some more digging on the Internet and came across this page: Desktop effects will not activate in Lucid Lynx. On page 3 it said to remove all packages related to NVIDIA, so I ran “sudo apt-get --purge remove nvidia*”. I’m still not sure how NVIDIA drivers got into the system during the upgrade, but after I rebooted, everything was fine.
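
If you run into the same thing, the two steps that mattered were checking the Xorg log for the GLX error and then purging the NVIDIA packages. Roughly (the log path is the standard one on Ubuntu):

grep -i glx /var/log/Xorg.0.log
sudo apt-get --purge remove nvidia*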

Posted by admin in Computers, 0 comments

Visual Studio 2010 Review

I recently downloaded Visual Studio 2010 from Microsoft’s DreamSpark website. After the nearly 2-hour install completed, I promptly opened it up and started exploring.

The first time I opened it, I noticed that the user interface is drawn with WPF. I’m not opposed to this, but the code editor window just doesn’t seem as responsive as in earlier versions. That may just be because it’s running on a 4-year-old single-core laptop.

The next thing I checked out was the Extension Manager. With it you can browse for your favorite extensions without opening a web browser, and it can check for updated extensions automatically.

In Visual Studio 2008, coding in C# had its downsides compared to VB. For example, IntelliSense only triggered when accessing a member, and if you did not spell the member’s parent correctly you got nothing. This has been fixed in VS2010, and I am now able to type my using statements with ease.

I have not been able to try it with multiple monitors since I only have one at the moment, although it does sound like a neat feature.

Overall I do like this new version of Visual Studio. It has some neat features and I’m still exploring.

Posted by admin in Computers, Programming, 0 comments