Idea: The Notebook Superdock

Right before I left on vacation, a few employees at work were about to be upgraded to new laptops, complete with laptop docks.

During my vacation, I saw this article about a laptop whose screen can be popped out to operate as a low-power tablet.

So…. I thought to myself, why can’t we do things the other way? I am sure there are plenty of people who have both workstations and laptops, so why don’t we create a dock that not only provides instant connection to ports, but also provides access to more CPU, RAM, hard disk space, and so on?

Some advantages:

  • Fewer maintenance issues, as the user doesn’t have to ensure that multiple systems are kept up to date.
  • No need to configure multiple systems: you are literally taking the same system with you everywhere; it just works better when you are in a non-mobile workspace.
  • Provides the convenience of a laptop, with the power and upgradability of a desktop.
  • You could have it so that your laptop’s files are synced to the dock’s storage when you dock your laptop, making for easy backups (see the sketch after this list).
  • Building on the above, reinstalling or dual booting wouldn’t be as big a deal if the bulk of the data is stored on or replicated to the dock, as restoring data would be trivial.
  • Both of the above operations should be faster over a dock connection than over USB or NAS solutions.
  • Given the assumption that files are replicated to the dock, in the event that the laptop is lost, stolen, or accidentally left at work, you could conceivably boot your system anyway (without the latest files, of course): with a keyboard, mouse, and monitor attached, the dock would basically be a computer system in its own right.
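
To make the sync-on-dock idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the mount point is made up, and a real dock would presumably signal the OS directly rather than being polled. It just illustrates what a docking-triggered backup could look like.

#!/usr/bin/env python
# Hypothetical sketch: when the dock's storage appears at a known mount
# point, mirror the laptop's home directory onto it.
import os
import subprocess
import time

DOCK_MOUNT = "/mnt/dock"          # assumed mount point for the dock's disk
SOURCE = os.path.expanduser("~")  # what we back up

def dock_present():
    # Polling the mount point is the simplest stand-in for a real
    # "laptop was docked" hardware event.
    return os.path.ismount(DOCK_MOUNT)

while True:
    if dock_present():
        # rsync only transfers what changed, so repeated docking stays fast.
        subprocess.call(["rsync", "-a", "--delete",
                         SOURCE + "/", os.path.join(DOCK_MOUNT, "backup") + "/"])
    time.sleep(60)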

I could see this being valuable for a number of people: namely anyone who has to give demos and meet with clients, but still needs a beefy computer for the non-demo portion of their job. Off the top of my head, this could include animators, graphic artists, and software developers.

To those who know their history, something similar was done with the PowerBook Duo Dock, which added more cache and RAM, but it obviously did not take off. Maybe now that people are focusing on small, lightweight laptops, the idea is worth a second look?

Thoughts?

Google App Engine

Today Google announced Google App Engine, a free (within certain limits) deployment environment in which people can develop, deploy, and host web apps. Currently, the offering is for Python only, and you are more or less tied to Google’s data store, but in return you don’t have to worry about server management, infrastructure, or paying money. Additionally, they have a pre-packaged SDK that gives developers an identical development sandbox, as well as a method to deploy local code to the production environment.
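
For a sense of how little code is involved, a complete app can be a single request handler. Here is a minimal hello-world sketch using the webapp framework bundled with the SDK, along the lines of the SDK’s own examples:

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        # Respond to GET / with a plain-text greeting.
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('Hello from App Engine!')

# Map the root URL to the handler above.
application = webapp.WSGIApplication([('/', MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()

Run the SDK’s local server against the app’s directory and the same code deploys unchanged to production.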

However, I think they missed a way to make it even MORE awesome. Imagine, if you will, that they had integrated this package with Google Code. Suddenly, developers would have an entire managed development and deployment framework: develop on your workstation in an environment identical to production, check in code so that it is managed, and use a web interface to deploy specific revisions or branches from the repository to either a staging area for testing or the production environment, with easy rollbacks if a problem is noticed. Since one of App Engine’s goals appears to be lowering the total cost of entry into the app development world, this just seems like it would make sense and work so well.
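
To make the idea concrete, here is a purely hypothetical sketch of what such a bridge could automate, assuming Google Code’s Subversion hosting and the SDK’s appcfg.py deploy tool (the tags/ layout and the script itself are my own invention):

#!/usr/bin/env python
# Hypothetical sketch: deploy a tagged repository revision to App Engine.
import subprocess
import sys
import tempfile

def deploy(repo_url, tag):
    workdir = tempfile.mkdtemp()
    app_dir = workdir + "/app"
    # Export a clean copy of the tagged revision (no working-copy metadata).
    subprocess.check_call(["svn", "export",
                           "%s/tags/%s" % (repo_url, tag), app_dir])
    # Push that exact revision to App Engine with the SDK's deploy tool.
    subprocess.check_call(["appcfg.py", "update", app_dir])

if __name__ == "__main__":
    deploy(sys.argv[1], sys.argv[2])

A rollback would just be the same script pointed at an earlier tag.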

The Sushi Showdown of Ultimate Destiny

This is the Sushi Showdown of Ultimate Destiny
Good rolls, bad rolls, and assorted pieces of sashimi

and only one will survive,
I wonder who it will be
this is the Sushi Showdown…

this is the Sushi Showdown…
this is the Sushi Showdown…
of Ultimate Destiny

It really just started off as a joke. Kate was torn between going to two restaurants (she had a gift certificate for one, but liked the other better), when I suggested the obvious solution: eat at both and compare the tastes! Better yet, eat at other sushi restaurants too and do a comparison across all of them! And thus the event ‘Sushi Crawl 2008’ was born.

Quickly, however, people began to think that a sushi analog to a pub crawl was not the greatest of ideas, due to the amount of time spent sitting around waiting for food preparation, and the fact that we would be ordering very little from each place in the grand scheme of things (you don’t want the sushi places you will potentially frequent in the future to be mad at you). The event quickly evolved into a sushi potluck. Everyone would be responsible for a takeout order from a restaurant, which would consist of no fewer than four items: a spicy salmon roll, tuna sashimi, a house roll, and whatever other item you wanted. Targets were selected, participants gathered, and we went off to do battle.

The original roll of opponents:

  • Hamachi House (Eventually skipped)
  • I Love Sushi
  • Sushi Nami
  • Dharma Sushi
  • Sushi Shige
  • Ichiban Sushi (Eventually skipped)
  • Minato Sushi
  • Doraku
  • Momoya (Dismissed due to ‘suckitude’ long before the day of the event)

The day of the event, we ran into a few minor problems, such as a torrential downpour, last-minute decisions about who goes where, dead phones (which hampered ordering), power losses (Hamachi lost power due to the storm), restaurant closures (Ichiban doesn’t do dinner in March), and potentially sick people. Despite these issues, the event was oh so good. 10 participants * ~4 items * ~6 pieces per item = ~240 pieces of sushi to be delectably savored and compared. Long ago, friends of mine told me that the best way to eat sushi is to gather a group of people, all order various things from the menu, and share, and I have to say, it certainly has its benefits: a healthy amount of ‘safe’ selections (that is, items you have tried and enjoyed previously), along with a variety of new stuff, without the disadvantage of being stuck with additional rolls if you don’t like something.

At the end of the night, Sushi Shige was declared the overall winner. The sashimi was moist and rich in flavor, the spicy rolls weren’t overly spicy, and the pieces were all well proportioned. Of course, Sushi Shige was also the most expensive overall, so you are paying for what you are getting. For more budget-conscious sushi goers, it was generally agreed that Doraku was also a very solid choice at a reasonable price. Ironically, the two places are just a block apart.

The overall loser of the night was I Love Sushi, which was generally below the quality of everything else. Whether it is the worst in all of Halifax is debatable, however, since Momoya is apparently so bad that it wasn’t even worthy of ordering from, and since Hamachi House (from my own experience in the last year) has been more of a hit-or-miss situation.

Obviously, this event needs to be repeated (for a proper data comparison set of course) and I am already looking forward to the next one.

Cell Phone Shopping And The Tale Of The Really Old JavaScript

Every so often, a friend of mine asks me why I don’t have a cell phone. I usually tell them how most cell phones cause feedback with my implant, either when I try to talk on them (feedback with the headset) or when they are placed near the processor (rare, but it recently happened with a BlackBerry; funnily enough, I could still make calls with the BlackBerry), and then the conversation moves on. Recently, however, a friend of mine came up with a solution (of sorts). He asked why I didn’t just get a phone and a text message plan. Then it doesn’t matter if I can’t make a call; I can still text people on the go.

Armed with this idea, I started to look around a bit. Shopping around on Telus, I stumbled upon the following while trying to add a service to my package.

Wow, interesting recommended browsers. IE4? Netscape 4? AOL 4? Also, consider that Opera has had 128-bit encryption support since 3.0 (which I believe was before the year 2000), and Telus isn’t recognizing it. I wonder just how Telus is deciding if a browser supports 128-bit encryption. Seems to be a JavaScript function….. (http://www.telusmobility.com/js/webapps.js) Hmmm…. looking at mytm_crosslinks (which is the function being called), it seems to check SSL compatibility based on browser version, completely ignoring anything that is not IE or Netscape.

Well, I guess at least they are discriminating against all alternative browsers equally… though I wonder why they decided to write their own browser-checking function (complete with redundant checks) rather than…. use one of the widely available ones. Actually, given that they recommend IE/Netscape 4, I wonder just how old this function is anyway, and why they are still using it.
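
As an aside, the encryption strength a connection actually gets is negotiated during the SSL/TLS handshake itself, not something a site can reliably infer from a browser version string. A quick sketch (modern Python, standard library only; the host is just an example) of how a client can see the cipher and key size it actually negotiated:

import socket
import ssl

host = "www.telusmobility.com"  # any HTTPS host works here
ctx = ssl.create_default_context()
with ctx.wrap_socket(socket.create_connection((host, 443)),
                     server_hostname=host) as s:
    # cipher() reports the negotiated suite: (name, protocol, secret bits)
    name, protocol, bits = s.cipher()
    print("%s over %s with %d-bit keys" % (name, protocol, bits))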

A Conferencing We Shall Go

Ah….. hard to believe it has been almost 4 months since my last post. There always seems to be just one more thing that needs to be done.

The most recent and exciting item was a conference. A few months ago, Oliver, Chris, and I were invited to a local conference called The Student Edge. The conference was partially organized by an ex-DSU president, and he kindly gave us the opportunity to talk about our experiences building software for the DSU. The conference was somewhat different from other ones I have been to, with roughly a third of the time spent on ad-hoc information exchange sessions, where the attendees basically shared their own stories (Societies Anonymous!), and the other two-thirds on more traditional talks.

Our talk, titled “Help Societies Help You: Using the Internet to Build a Better Campus”, was well received by about a third of the attendees. We tried to not only tell our own story, but to convey the message that if 3 students can build some applications for the DSU, then maybe other schools can leverage their own student populations to do some cool stuff.
A lot of work was put into this, from the slides, to the videos that Chris put together, to the new site and demo that Oliver put together, as well as some kick-ass business cards that Oliver managed to make at the last minute. Our talk ended up being almost exactly the length we expected (a somewhat new experience for me; maybe I am getting the hang of this conference thing!), partially because Oliver went into some additional detail, whereas I talked a bit too fast again (OK, maybe my conference habits could still use a little work…). We had a little brainstorming/Q&A session after, which ended up being mostly Q&A, though there were several comments about things that people liked in what they saw.

The more interactive part was the Tech Showcase that we set up for the following day. This was where we received the bulk of the feedback, conversation, and ideas. Many people were fairly positive about the systems we had. One individual who attended the talk brought someone else from their student union over to the demo station we had set up. Partway through our walkthrough, the person who had attended the talk kinda jumped in and demoed the functionality himself. I was pleasantly surprised by this, as I wasn’t expecting someone to become engaged so quickly.

Other people came over to talk about where we want to go with this, and where we think we stand relative to other systems, notably Facebook. This is a topic we anticipated and brought up in our talk, and some of the dialogue was pretty interesting. Mainly, it comes down to access and community. Facebook is centred around the individual, and you find things (for the most part) through who you are friends with, whereas our work centres around the community, providing individuals the ability to find information without jumping through any hoops. There was also talk about integration and Facebook widgets, and what our take on those topics was.

Finally, we also got a glimpse of ‘the competition’. UWO’s student union apparently contracted out a similar system to ours (with a few key differences), and it was on display at the Tech Showcase. It also made for a contrast with our showcase: UWO was shown as an organization willing to buy, and we were shown as an organization willing to build.

All in all it was a great conference and a great opportunity, and I am glad I was able to take advantage of it.

What’s Hot – my.dsu.ca

I heard a rumor that my.dsu.ca had been featured in this year’s Maclean’s Guide To Canadian Universities. Deciding to check this out, I came across the following in the “Campus Confidential: Straight from the Students” section of the Dalhousie University entry:

What’s Hot

…Student union website my.dsu.ca for news and information about local events and campus societies.

It’s nice to know that something I have put a great deal of time into was deemed worth mentioning by students to Maclean’s.

FreeBSD, Ruby And The Case Of The Failing Gems

The server that hosts some of my stuff (i.e. this site) happens to be running FreeBSD. In most cases, this doesn’t cause any problems. However, when installing certain Ruby gems, I would get an error while compiling native extensions. In particular, with the fastthread gem (I was trying to upgrade Mongrel), I received the following output:


make install
/usr/bin/install -c -o root -g wheel -m 0755 fastthread.so /home/ssmith/gems/gems/fastthread-0.6.4.1/lib
install: /home/ssmith/gems/gems/fastthread-0.6.4.1/lib/fastthread.so: chown/chgrp: Operation not permitted

I received a similar error a while back when trying to install Mongrel, and at the time I had asked one of the friendly admins to just install it as root. However, I like installing my gems locally, so they are easy to update, and this was preventing that. Additionally, I wasn’t experiencing this on another machine running Debian, so what was going on?

Some investigation showed that on Debian, install wasn’t being told to change the owner and group, while for some reason it was on the FreeBSD machine. So where was this being set?

It turns out that rbconfig.rb, part of the Ruby distribution itself, was the culprit. A collection of configuration variables for Ruby, it had the following line:

CONFIG["INSTALL"] = /usr/bin/install -c -o root -g wheel

while on Debian we only had:

CONFIG["INSTALL"] = /usr/bin/install -c

Apparently, the FreeBSD Ruby port sets things a little differently, which causes pain for local installation of gems. For now, I’ve fixed the problem by setting that line to the same value as on Debian. However, as that file is automatically generated each time Ruby is updated, hopefully the root issue will be fixed in the next version of the FreeBSD port.

TigerEvents 0.7.1 – It’s About Time

So….. I have actually had a release version of TigerEvents for a long time now. 0.7.0 has been powering my.dsu.ca, but it was never officially released. As I have added several features since then, as mentioned a while ago, I decided to just increment the release number and push it out there.

This decision was prompted by an organization other than the DSU contacting me, saying they wanted to use it, and asking if I could just add this one little feature enhancement……

Needless to say, the idea of other people using this pleases me, and hopefully in the near future I can point to other in-production instances.

The code, as always, is available for download at the SourceForge and RubyForge locations.

PayPerPost – Interesting Idea, But How Are The Results?

Recently, PayPerPost has come up on several sites I monitor. Even more recently, several individuals and posts closer to home have discussed some of the issues, and as a result, I’ve decided to weigh in with my own two cents.
To start off, the idea of PayPerPost as a service makes a degree of sense. I mean, consumer reviews of products are useful in helping other consumers decide whether to use a product, so why shouldn’t a company spread some money around to help generate advertising and feedback?

The problem, however, is with the end result. One comment I have read sums up a lot of my feelings about PayPerPost reviews: “There really isn’t much review on those reviews”. The quality of a lot of the reviews just plain stinks. In some instances, the ‘reviews’ boil down to link spam, resulting in at least one index service delisting some sites. However, individuals still get paid for these sorts of posts, so where is the incentive to improve quality? Thankfully, PayPerPost has already recognized this, and is close to introducing segmentation so that companies can choose quality over quantity. For my part, I prefer reviews where someone has actually used something, as opposed to having casually looked it over; as a result, when I do my own reviews (mostly books), I make sure to actually read the entire thing, and I don’t hesitate to point out deficiencies.

Another problem I have is again centred on the actual bloggers: quantity and consistency. There are one or two sites I read that are basically reviews or news of products. I have no problem with these sites, as they are focused on a specific area. This focus actually enhances their credibility, as I am more likely to believe a software review from a guy who consistently reviews software than from a guy who jumps from software, to loans, to a baby alligator pet show in downtown Manhattan. The problem becomes worse when a blogger publishes several sponsored posts on a variety of disjoint topics, making it that much harder for a reader to find any meat on the site.

While I am not expecting something super professional, it would be nice to see some extra thought in the many sponsored posts I have seen. Taking some extra time for quality, rather than quickly pumping out several sponsored posts, could serve to improve credibility; and with the shift to segmentation, some extra quality and credibility can only improve the number of opportunities individuals have access to.

Google Analytics

Like other people I know, I have been testing Google Analytics. There are some good things about it, but overall I am not really impressed.

The Good:

Content Summary. Specifically, the percentage change in traffic that content gets over a time range. I love easily seeing how traffic is growing or subsiding for individual (well, the top 5, anyway) articles. Hey! Recently my TurnItIn.com article traffic was up 50%! Instant, easy-to-read feedback like that is awesome.

Variable time range. I like being able to see data for time ranges other than a month (which is the standard for things like Webalizer). Google makes this simple to do.

Data Storage. Some free web site statistics programs only keep data for a certain amount of time, and only show you a limited range of that. For example, the free version of StatCounter only tracks 2 weeks. Analytics stores……. probably all the information it has ever received, which ties in nicely with the variable time range above.

It IS powerful. If you have the time and desire, I am sure you can massage the data to give you tons of information (too bad I don’t care about most of it).

The Bad:

Does not integrate with other Google products. Google Sitemaps integration seems like a no-brainer, given that it is for related data. Google Maps for the map overlay would seem to make sense as well, so that I could actually zoom in on an area, rather than seeing a large grouping of dots along the east coast.

All the ad-related stuff. OK, I understand that a lot of people want to track their ad revenue, $ index, whatever. But what about those who don’t? I personally find all the ad-related stuff clutters the interface I am trying to use.

Unintuitive. The breakdown into Executive, Marketer, and Webmaster sections makes finding things difficult. For a while, I couldn’t find a simple referring-pages list. More recently I found it under Marketing Optimization -> Visitor Segment Performance -> Referring Source. I couldn’t figure out filtering for a while, either. Having to click a graphic to switch from excluding a filtered item to filtering by it didn’t seem obvious.

Flexible in some ways, obtuse in others. In a lot of places you can display up to 500 records, but for the content summary you only get the top 5. Sure, you can get most of the same information under Content Optimization -> Content Performance, but that section is missing the percentage changes over time (you know, the thing I thought was GOOD) that the summary has. Also, the referrer information only gives me the domain name, not which page the user came from.

The (Possibly) Ugly:

Google stores all this information and uses it for their own purposes. Some individuals understandably have privacy concerns regarding this. As Ian said, it’s basically a toss-up. Decide if it’s right for you.

Conclusion:

For something free, Analytics is better than some packages. I personally feel that I am presented with way more information than I really want. For something more revenue related, your mileage may vary.
