VHDL is a semi-broken language. Now, I definitely agree with Jan Decaluwe’s blog post that VHDL’s handling of determinism is a huge asset (see: Verilog’s Major Flaw). But from a programmer’s perspective the language is obnoxious to use. Some of its features, like file I/O, are barely passable, and everyday tasks take far more brain function than they should.
Why is this? It’s always seemed like the language was a bit schizophrenic. Judging by the steering committee’s history with shared variables, this might be because hardware engineers resisted software-like concepts early on. And from what I can tell, the IEEE has done a fairly poor job shepherding the language in the 3 decades since it was created. Why it was never open-sourced, I’ll never understand. I actually asked (who, I can’t remember) during a webinar Q&A on VHDL-2008 whether they’d consider open-sourcing the language, and was told that it “didn’t sound like a good idea.” Hmm. Maybe traditional EDA companies are to blame?
If I had more free time, I’d start writing an open-source library of common VHDL building blocks myself. In fact, I almost did: I got as far as writing down what I would put into it. That list is reproduced here:
Serial to parallel
Parallel to serial
Synchronous & Asynchronous FIFOs
Data Packetizer/Depacketizers (data marshaling)
Processor Bus Interfaces
A/D, D/A Converter Interfaces
All of these should be easily customized through generics, and contain self-verifying unit testbenches and documentation.
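To sketch what “self-verifying” might look like, here’s a behavioral reference model of the serial-to-parallel block, paired with checks against hand-computed expectations. It’s Python rather than VHDL purely to keep the sketch short, and the function name and MSB-first convention are my own assumptions, not part of the list above.

```python
def serial_to_parallel(bits, width):
    """Reference model: pack a serial bit stream into `width`-bit words, MSB-first.
    Trailing bits that don't fill a whole word are discarded, just as a shift
    register would hold them until its 'word valid' strobe fires."""
    words = []
    for i in range(0, len(bits) - len(bits) % width, width):
        word = 0
        for bit in bits[i:i + width]:
            word = (word << 1) | bit  # shift in one bit per "clock"
        words.append(word)
    return words

# Self-checking tests: compare the model against expected parallel words.
assert serial_to_parallel([1, 0, 1, 0, 1, 1, 1, 1], 4) == [0b1010, 0b1111]
assert serial_to_parallel([1, 1, 0], 2) == [0b11]  # trailing bit dropped
```

A VHDL testbench for the real entity would do the same thing: drive the serial stimulus, then compare the DUT’s output words against a model like this.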
Some might argue that many of these aren’t needed because they’re already purchasable as IP. In my experience, purchased IP always requires more of your own engineering than you expect, despite having paid for it, and the support is pretty terrible. The strength of an open-source framework is that it gets the most eyes on the code and documentation, catching errors and continually improving quality. Often, you can’t even look at purchased IP’s source code to find out what’s wrong! I’ve been burned by this before.
So now all we need is a catchy sounding name and to grab the domain.
Testing. It’s the most important part of engineering, whether it’s software or hardware. Yet all too often it gets sidestepped in the push to get product out the door. It is woefully underestimated in budgets and schedules. Then, once the schedule starts slipping, engineers have trouble fighting the urge to skimp on testing to get things “done” faster. Most frustrating of all is the pressure from management to cut testing hours from project estimates, because testing “costs too much,” “takes too long,” or “doesn’t add enough value.”
By day, I’m an FPGA developer and write mostly VHDL. Despite what some software developers might think about FPGA developers, we do quite a bit of unit testing, albeit without a formal framework. And that’s the problem. As schedules start slipping, the stress and anxiety of missing a deadline cause testing rigor to decrease. Ironically, if the project had a formal unit test suite that could be relied upon, stress and anxiety would be lessened because engineers wouldn’t have to fear breaking something in the frantic push to ship on time. They could simply launch their regression tests and review the results.
The most difficult part of engineering that takes place inside a text editor is that once our creations “live” in the physical world, it becomes much harder to test them and diagnose problems.
Suppose you’re an analog engineer designing a bandpass filter using discrete components, and you just got the manufactured boards back. How do you know the filter works? Do you just look at it and say, “well, it looks like a bandpass filter, so it probably works”? Of course not. You connect a signal generator to the input, place a spectrum analyzer on the output, sweep across input frequencies, and measure the output signal to confirm the filter has the correct attenuation for out-of-band frequencies, etc. If you were to discover a problem, you’d grab a Fluke and confirm that the passive components have the proper values. This is, in essence, a unit test.
Test suites written with frameworks like JUnit give you that warm, fuzzy feeling that all of the little pieces that make up your design are working correctly. This reduces the stress on your engineers, which I believe has a strong, direct correlation with product quality.
VHDL desperately needs an open-source unit testing framework modeled after JUnit. In general, FPGA and ASIC development methodologies have severely lagged innovations in the software world. Kent Beck, the father of Test Driven Development, wrote his landmark Simple Smalltalk Testing: With Patterns paper in 1989. It’s now 21 years later, and there is no vhdlUnit.
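To make the gap concrete: the heart of the xUnit pattern is tiny. Here’s a minimal sketch, in Python and with hypothetical names, of the record-failures-and-keep-running structure that a vhdlUnit would give VHDL testbenches:

```python
def check_equal(actual, expected, msg=""):
    """Record a pass/fail result instead of aborting the whole run."""
    return (actual == expected, msg or f"expected {expected}, got {actual}")

def run_suite(tests):
    """Run every (name, test) pair and return (passed, failed) counts --
    the regression summary engineers would review after each run."""
    passed = failed = 0
    for name, test in tests:
        ok, msg = test()
        if ok:
            passed += 1
        else:
            failed += 1
            print(f"FAIL {name}: {msg}")
    return passed, failed

# Two toy "testbenches": one passes, one deliberately fails.
results = run_suite([
    ("adder_works", lambda: check_equal(2 + 2, 4)),
    ("known_bug",   lambda: check_equal(1, 2)),
])
```

That’s the whole idea: named tests, assertions that report rather than halt, and a summary you can trust at the end of a regression run.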
It’s basically accepted as fact by now that the iPhone is coming to Verizon. When? Probably this Christmas, or early in 2011. The point is it’s going to happen, and when it does, things are going to get interesting.
AT&T is currently believed to be paying somewhere around $350-$400 in subsidy for each iPhone sold. That’s a nice, healthy chunk of change for Apple. Since AT&T has exclusivity of the iPhone, they have decided that this subsidy is an acceptable cost for the higher monthly service fees they receive from smartphone customers on their 2 year contracts.
Over on Verizon, there is now a bevy of Android-based smartphones selling like hotcakes. Something more interesting is happening on the Android side, though. Because there are so many handsets to choose from, you can get an Android phone for less; many handsets are priced below $99. The Motorola Droid has been out for only 9 months and is already priced lower than at launch. Part of this is due to the Droid X’s release, only 8 months after the Droid. But the fact that Android handset makers need to be on such an aggressive release schedule is evidence of how hard they’re competing against each other.
Compare this to Apple. Their handsets don’t have to compete with other iOS-based handsets on AT&T. As a result, Apple can keep their prices higher: the latest iPhone model, released every year in June, has kept the same price points of $99 and $199 since the iPhone 3G was released. They don’t have to drop their prices, because there’s no alternative iOS handset.
Will this change once the iPhone hits Verizon? I bet it will. How else can Verizon and AT&T compete for iPhone customers, if not on price of the handset? It makes more business sense to give customers a discount on the handset, rather than drop the price of service contracts. Dropping the price of the iPhone another $30-50 is a drop in the bucket compared to losing $10/month over the course of a 2 year contract. Despite the iPhone being available on Verizon and AT&T at that point, it’s not as though customers can jump ship to the other carrier–the cellular radios are incompatible.
When the iPhone comes to Verizon, expect to see the price drop. This won’t hurt Apple, mind you; if anything, they’ll get a slightly larger subsidy from the carriers as they compete for iPhone customers’ wallets.
By now everyone in the world has heard that Apple’s shiny new iPhone 4 allegedly has a fatal flaw worse than the Death Star’s. Consumer Reports claims that their “engineers have confirmed that iPhone 4 has an antenna problem…” and that it “…really is only with the iPhone 4.”
Oh really? Just watch this video I recorded this evening with the original iPhone 2G, on T-Mobile USA’s network:
Anand Lal Shimpi over at AnandTech did quite a rigorous investigation into iPhone 4’s reception and observed that:
“squeezing it really tightly, you can drop as much as 24 dB. Holding it naturally, I measured an average drop of 20 dB.”
Interesting! I measured an average drop of 20 dB when holding my iPhone 2G naturally, as seen in my video above. But here’s the kicker from Anand’s research:
“From my day of testing, I’ve determined that the iPhone 4 performs much better than the 3GS in situations where signal is very low, at -113 dBm (1 bar)…I can honestly say that I’ve never held onto so many calls and data simultaneously on 1 bar at -113 dBm as I have with the iPhone 4, so it’s readily apparent that the new baseband hardware is much more sensitive compared to what was in the 3GS. The difference is that reception is massively better on the iPhone 4 in actual use.”
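Because dB figures are logarithmic, those attenuation numbers are bigger than they might sound: a drop of X dB corresponds to a power ratio of 10^(X/10). A quick sketch:

```python
def db_to_power_ratio(db):
    # A drop of `db` decibels means received power fell by a factor of 10**(db/10).
    return 10 ** (db / 10)

print(db_to_power_ratio(20))  # 20 dB ("holding it naturally") -> 100x less power
print(db_to_power_ratio(24))  # 24 dB ("squeezing really tightly") -> roughly 250x less
```

So a 20 dB drop means the phone is working with one hundredth of the signal power it had a moment earlier, which is exactly why sensitivity at the low end matters so much.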
In my opinion (and I happen to be an RF communications engineer at one of the largest cellphone designers in the world), the only thing Consumer Reports can really say with certainty is that signal strength depends on a variety of factors, one of which is how the phone is held. Any smartphone, or really any radio for that matter, will have its performance affected by how the antenna is placed, held, etc. It’s irresponsible and dishonest for them to claim anything otherwise.
But wait a minute! Consumer Reports said that they tested the iPhone 4 in a “signal proof room” that simulates “real life conditions”! Before addressing that, some background:
Your smartphone, and the cellular tower (or “base station”) both are really just digital radios. When you make phone calls, you’re talking on a walkie talkie that digitizes your voice and sends it to the base station using radio waves. Once there, the base station relays that voice data through the carrier’s network to the phone on the other end of the call, and vice versa.
For this whole system to work, you need a certain Signal-to-Noise Ratio, or SNR, at both the smartphone and the base station. SNR, being a ratio, is expressed in dB; absolute signal power is measured in dBm, or decibels referenced to one milliwatt (mW), the standard unit of measure for RF engineers. If the signal drops below the sensitivity of either radio, packets (small chunks of data, i.e. your voice or network synchronization traffic) start getting dropped. If SNR stays too low for too long, too many packets are lost (this is when your voice starts “breaking up”), and if SNR doesn’t recover to a level above sensitivity, the tower drops the call so that you don’t consume precious bandwidth and prevent others from making calls.
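For the curious, the dBm-to-milliwatt conversion is a one-liner, and it shows just how faint that -113 dBm (1 bar) signal from Anand’s test really is:

```python
def dbm_to_mw(dbm):
    # dBm is absolute power referenced to 1 mW: P(mW) = 10**(dBm/10).
    return 10 ** (dbm / 10)

print(dbm_to_mw(0))     # 0 dBm   = 1 mW
print(dbm_to_mw(30))    # 30 dBm  = 1000 mW (1 W)
print(dbm_to_mw(-113))  # -113 dBm ~ 5e-12 mW, a few trillionths of a milliwatt
```

Holding a call at that power level is a genuinely hard engineering problem.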
What can cause SNR to drop? A whole slew of things:
Distance from the cell tower
Interference from other radio waves
Attenuation from buildings, trees, trucks passing by, and how you hold the phone 😉
Multipath distortion: a phenomenon in radio communications where multiple copies of the transmitted signal bounce off buildings, etc. and arrive at slightly delayed times, causing inter-symbol interference (ISI). Think of it as the radio confusing 0’s and 1’s, corrupting the packets sent. 2G, 3G, and 4G technologies do have equalization circuits and forward-error-correction (FEC) circuits that help combat this, but they require a stronger SNR than otherwise to decode the bits sent over the air.
So, back to the “real life conditions” in Consumer Reports’ “signal proof room.” The room they’re referring to is what’s known as a screen room. In their video, they claim that this environment simulates “real life conditions.” No, it doesn’t! Screen rooms are designed to test radio performance while excluding real-life issues such as multipath, deep fades, interference, etc.
Does shorting the two antennas together cause a degradation in the performance of iPhone 4’s antenna? Sure. Does holding any phone change the performance of the antenna? Absolutely. Is the iPhone 4’s antenna a flawed design? Absolutely not. Could it be better? Definitely…but so can anything. Anand does suggest that Apple should “add an insulative coating…or subsidize bumper cases”. I’m not sure I agree, at least not yet. Depending on how Apple designed their antenna and radio front end, they could improve radio performance with a software update–I’ve implemented algorithms that did precisely this.
All in all, it seems clear that Consumer Reports didn’t prove anything is “flawed” with the iPhone 4, and acted irresponsibly in making the claims they did. The evidence they gave doesn’t support their claims, and was more smoke and mirrors than concrete information. It’s going to be difficult for me to trust their reviews of products in the future. As for what Apple does next, stay tuned for their invitation only press conference, scheduled for this Friday.
If you’ve seen the latest salvo by Verizon’s marketing ‘geniuses’, you’re led to believe that only girly-girls buy iPhones. Apparently if you want to be a man, you need a DROID.
If you thought past TV spots for the DROID were bad, check out the latest. You’ll swear you can hear the ‘Team America, World Police’ theme song in the background.
This ad campaign seems hellbent on condemning the DROID to be a niche device rather than one with consumer mass-appeal. No wonder rumors of Google launching their own phone had everybody buzzing over the weekend. There are too many niche devices emerging on the Android platform, and Google is rapidly turning into the Microsoft of smartphones by providing the OS to hardware manufacturers but not launching any devices themselves.
It’s not clear that replicating a Microsoft business model will be profitable for the likes of Motorola, HTC, et al. Just look at what has happened to PC margins over the past few years: you can buy a netbook for $200 at razor-thin margins to the manufacturer, yet Apple continues to grow its laptop and desktop market share while commanding margins in excess of 30%.
Google may not be a hardware company, but Motorola better hope that they’re not thinking, “oh this is why Apple made their own phone.”
Google has announced Chrome OS, an operating system built on the Linux kernel for desktop computers and netbooks. Are you as unsurprised as I am? I hope to god Microsoft isn’t. But then again, ever since the release of Windows XP, Microsoft has been running around like a chicken with its head cut off.
This, dear friends, was inevitable. But many people may be asking, why? Just like many asked the same question when Google released their Chrome Browser.
It has nothing to do with the fact that Google and Microsoft are competing like, well, Goliath and Goliath. It runs much deeper than that. As years have gone by, operating systems (well, Microsoft’s in particular) have gotten way too bloated, and the Internet has gotten way too fast. You know the routine by now: every 2-3 years, you buy a new PC, with a faster processor, faster graphics processor, bigger hard drive, and with a new version of Windows on it. And I’m sure plenty of you asked, “Why? The Internet looks the same on all of these computers!”. You’re exactly right.
How much time does anyone spend on their computer these days that’s not in a web-browser, using web-based email or web-based instant messenger? The Web Browser is the new OS (well, at least the part of it you see). Anyone still surprised that Google’s web browser and newly announced OS share the same name?
In the last few years, web technologies have gotten incredibly good at making the browser feel the same as your desktop. AJAX really opened the floodgates for web-based applications that behave like “regular” programs. As a result, Google Docs has gotten pretty damn good. In fact, I don’t even own a copy of Microsoft Office anymore; I just do all of my documents in Google Docs. I have yet to find a feature I need that they don’t have, and my documents are available to me anywhere, because they’re in the cloud.
More recent developments are pushing us even closer to a web browser OS world.
What about the HTML5 video standard? Once codecs are decided on, embedding video will be as easy as embedding an image in your HTML. This will render Flash video obsolete. Apple doesn’t look so dumb for not worrying about Flash running on the iPhone now, do they?
So, the web browser becomes the artist’s blank page, for software slingers to serve up applications for email, photo editing, making presentations, editing documents. And underneath will be the OS kernel, providing access to the hardware on your device. And that’s it. It doesn’t have to be complicated, because you don’t need to have all your software installed locally. It’s in the cloud, where it belongs.
Each year there is a competition known well by those in the field of artificial intelligence called the Loebner Prize. The ultimate goal of artificial intelligence is to make it indistinguishable from real intelligence (so someday geeks like me can just code up a friend in C++). In the 1950s, Alan Turing proposed a test–cleverly named the Turing Test–to measure an AI’s level of intelligence, which the Loebner Prize competition uses to select their winner. The contestants basically write software to implement their artificial entities, and then judges chat with both real humans and the artificial entities and then guess which ones are real humans and which ones are not.
Criticisms of the accuracy of the Turing Test aside, today I saw something that reminded me of just how far off truly intelligent artificial beings really are. I was watching a video on YouTube from Bill Maher’s TV show, and he threw out some statistic like “8 kids are shot and killed with guns every day in America.” Since that seemed like an awfully high number, I decided to check into it. So I pulled up Google and typed in “kids shot”. Not my best Google search query ever, but I figured it would get the job done. Google dutifully returned a saddening number of news stories about kids being shot by guns. It also came back with a single advertisement generated by good ol’ AdWords…
Not only is the statistic true; if you click on that advertisement, it goes to a toy gun that Target is selling! If Target ran a TV ad for toy guns right after a news story about a kid getting shot, they’d have a PR nightmare on their hands. Lucky for them, you can’t yet blame a computer for doing the AdWords equivalent.