Jan 23 15

Virtual Reality vs. Augmented Reality vs. Holograms

by Dave Davis

[Cross-posted from blog.davemdavis.net]

On January 21, 2015, Microsoft announced that the science fiction of holograms has become science fact. They announced a new product, based on Windows 10, called HoloLens, the first self-contained, wearable computer that can create holograms. This announcement has generated a lot of buzz. If you haven't seen the video Microsoft put out, take a minute, follow the link above, and watch it. I'll wait… You're back. Were you blown away? I was. My mind immediately raced to the problems I could solve if this truly pans out. More on that in a bit.

Virtual Reality

“Virtual Reality (VR), sometimes referred to as immersive multimedia, is a computer-simulated environment that can simulate physical presence in places in the real world or imagined worlds. Virtual reality can recreate sensory experiences, which include virtual taste, sight, smell, sound, touch, etc.” Wikipedia

When Microsoft announced HoloLens, some people mistakenly called it "virtual reality." Although Microsoft showed immersive experiences, the fact that you can still see the world around you precludes it from being virtual reality. A prime example of virtual reality is the Oculus Rift.

Augmented Reality

“Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.” Wikipedia

HoloLens is really augmented reality plus much more; I will explain what I mean in a bit. Augmented reality is not really new. There are phone apps, such as Yelp, that use the phone's camera to display the world around you while superimposing restaurant information based on the direction the phone is pointed. There is also the translator app for Windows Phone that superimposes translated text over written text, letting you switch between languages.

Another recent example is Google Glass (though Google has suspended the program). Google Glass is a pair of glasses that puts a heads-up display on the lens, providing information to the wearer. That information sits in a static location, no matter which direction the user is facing.

Hologram

“Holography is a technique which enables three-dimensional images (holograms) to be made. It involves the use of a laser, interference, diffraction, light intensity recording and suitable illumination of the recording. The image changes as the position and orientation of the viewing system changes in exactly the same way as if the object were still present, thus making the image appear three-dimensional.” Wikipedia

The HoloLens can create realistic three-dimensional images and place them in the world around you. So I would say that HoloLens is a combination of all three concepts: although it may not be creating true holograms, they seem real enough to the wearer.

HoloLens

To be clear, I have not had an opportunity to try HoloLens. I was not one of the chosen few who got to attend the event, but the reactions from those who did try the canned demos were overwhelmingly positive. Until I get to try it myself, I can only rely on what they have said. I do have some questions as well.

First, is Microsoft targeting consumers, enterprises, or both? The thing that will really determine that is price. When the Xbox One first came out, it was $499 and adoption was slow. When Microsoft dropped the price to $350 this past holiday season, consoles sold like hotcakes. Granted, there is no direct competitor to HoloLens (as of yet), but if it is priced too high, it may be out of reach for the average consumer.

The next question I have is about the form factor itself. If this is intended to be worn for long periods of time, it needs to be comfortable. Google Glass was a pair of glasses, so it was easy to wear for long stretches. HoloLens will have far more functionality than Google Glass, and all that functionality requires some pretty heavy computing power. Microsoft has packed all of it into a self-contained device, or "donut," as my coworker likes to call it. Is v1 going to be too big or too bulky? With all that computing power, what will the battery life be?

Finally, can Microsoft truly deliver the experience they showed in the videos? That will be the true test of the device's success. Judging from the reaction of the reporters at the event, they are pretty close. Microsoft has gotten a lot of people excited with this announcement, many of whom had all but written the company off. If they mess this up, they may drive those people away for good.

The Possibilities

A couple of years ago, Microsoft released a vision video that captured my imagination. It showed off a lot of "imagined" technologies: how technology would blend into the environment around you and become ingrained in everyday life. Most of what they showed was not real, but with HoloLens and Surface Hub, some of those use cases are now possible.

I am excited about the possibilities this opens up. Microsoft has said that HoloLens apps are just Universal Apps with some added APIs. Hopefully, they will release an SDK at their Build conference. If you weren't able to get in or can't attend, they usually make the sessions available online soon after.

The video that Microsoft released shows all kinds of use cases for HoloLens. I have a few of my own and I am excited to see what others come up with.

Summary

In recent years, these press events have had very few surprises. Look, for instance, at the last Apple launch; nothing was announced that had not previously leaked. Microsoft did a great job keeping HoloLens a secret. There were rumors of an Xbox gaming helmet, but this is so much more. You can see pieces of these technologies in various Microsoft Research projects, and it is great to see them finally capitalizing on some of that research. Only time will tell whether HoloLens will be a success, but you have to admit that living in a time when holograms can be real is pretty cool.

Jan 14 15

Six SQL Server Resolutions for 2015

by Bill Lescher

As we embark on the 22nd year of everyone’s favorite RDBMS, I decided to create a tuple of SQL Server New Year’s resolutions.  Hopefully you can find some things in this list that ring true for you.

Test your database backups – When was the last time you successfully restored a production database backup file? Ideally, this is a regularly scheduled process. Make sure your backups are good, and make sure you know what to do in the event of an emergency. Do you have scripts ready to restore to a point in time if you had to?
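If you want to script that first sanity check, RESTORE VERIFYONLY is a reasonable start – it only proves the file is complete and readable, so an actual restore is still the real test. A minimal C# sketch (the server name and backup path are placeholders):

using System;
using System.Data.SqlClient;

class BackupVerifier
{
    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.;Integrated Security=true"))
        {
            conn.Open();
            // VERIFYONLY confirms the backup file is complete and readable;
            // schedule a real RESTORE to a scratch server for a true test.
            var cmd = new SqlCommand(
                @"RESTORE VERIFYONLY FROM DISK = N'D:\Backups\MyDb.bak'", conn);
            cmd.CommandTimeout = 0; // large backup files can take a while
            cmd.ExecuteNonQuery();
            Console.WriteLine("Backup file verified.");
        }
    }
}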

Update your maintenance jobs – Are your databases being maintained properly? If you haven't looked under the hood of your database maintenance jobs lately, now is a good time to make sure your indexes, statistics, and consistency checks are all squared away. If you're already using best-in-class scripts, like Ola Hallengren's, double-check that you have the latest version and are taking advantage of all the spectacular options available.

Implement a baseline – Do you know what your SQL Server looks like under normal conditions? When someone complains that the system is "slow," can you tell if something unusual is happening? If not, it's time to start collecting some metrics. Create a simple database and one SQL Agent job with a handful of steps to capture the basics: CPU usage, memory usage, I/O, index usage, and top queries. Keep an eye on the database size, and be sure to set up a purge process.
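Here is a rough sketch of what one capture step might look like. Normally the T-SQL would sit directly in a SQL Agent job step; the C# wrapper just keeps the example self-contained, and the BaselineDB table is made up:

using System.Data.SqlClient;

class BaselineCapture
{
    // Hypothetical baseline table: BaselineDB.dbo.PerfCounters(capture_time, counter_name, cntr_value)
    const string CaptureSql = @"
        INSERT INTO BaselineDB.dbo.PerfCounters (capture_time, counter_name, cntr_value)
        SELECT SYSDATETIME(), RTRIM(counter_name), cntr_value
        FROM sys.dm_os_performance_counters
        WHERE counter_name IN (N'Page life expectancy', N'Batch Requests/sec');";

    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.;Integrated Security=true"))
        using (var cmd = new SqlCommand(CaptureSql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery(); // one sample; the Agent schedule provides the cadence
        }
    }
}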

Study up on DMVs – I don’t know if there is anyone who has completely mastered the SQL Server system catalog.  I do know that there is always another gem of a diagnostic query out there just waiting for me to learn about.  My favorite authority on the subject is Glenn Berry.  His scripts are priceless.

Learn Extended Events – In a crunch it’s easiest to fall back on good old SQL Server Profiler, but you know it’s time to bite the bullet and learn how to use Extended Events.
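To take some of the sting out of that, here is a minimal sketch of a starter session that captures statements running longer than one second – roughly the trace many of us kept opening Profiler for. The session and file names are made up:

using System.Data.SqlClient;

class CreateXEventSession
{
    const string Ddl = @"
        CREATE EVENT SESSION [LongQueries] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
            (ACTION (sqlserver.sql_text)
             WHERE duration > 1000000)  -- duration is in microseconds
        ADD TARGET package0.event_file (SET filename = N'LongQueries.xel');";

    const string Start = @"ALTER EVENT SESSION [LongQueries] ON SERVER STATE = START;";

    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.;Integrated Security=true"))
        {
            conn.Open();
            new SqlCommand(Ddl, conn).ExecuteNonQuery();
            new SqlCommand(Start, conn).ExecuteNonQuery();
        }
    }
}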

Attend a user group meeting – If you’re not already doing so, get yourself out to a local PASS chapter meeting.  Even if you’re shy and/or well-versed in the topic being discussed, just sitting in a room with other database professionals can be inspiring.  It’s nice to be reminded that there are others out there with the same challenges you face.

There you have it.  With only 6 resolutions, you could procrastinate for 2 months on each task before you’re ready for the 2016 list.

What are your SQL Server resolutions for 2015?

Jan 8 15

Throw a life preserver to that corrupt PowerPivot model

by Gene Furibondo

Recently, I was in the throes of writing some deep, well-thought-out, frustratingly simple yet mind-numbingly complex DAX calculations. I had things just about where I wanted them and had started cleaning up my model a bit with some typical housekeeping (renaming, reordering, etc.). I don't know for sure if that did it, but I'm pretty sure. By "it," I mean leave my model in a state where I could not modify anything, leaving me in a mix of blind rage and baby-like tears. Here is the error message I received when trying to open PowerPivot.

I've posted the entire error message below for search's sake, but long story short, my PowerPivot model was hosed. When I click OK on the above error message, I'm served a blank 'Grid' PowerPivot canvas. Because you are smart, you're thinking: try switching to 'Diagram' view and making your changes there. Good idea. I was able to view my tables in 'Diagram' view; however, I could not extract the DAX calculations, and any attempt to change anything in the model resulted in a never-ending spinning icon.

I'd given up trying to recover the PowerPivot model in its entirety. If I could just get my hands on those sweet DAX calculations I had constructed, I could easily recreate the model itself. I tried EVERYTHING. I even tried opening the Excel file in good ole Notepad and extracting what I could out of there. I thought to myself, "There is no way this is going to work." And I was right. It didn't.

I was just about ready to give up when, what to my wondering eyes should appear, a related post about importing your PowerPivot model into an SSAS Tabular instance. I need to give credit to Gerhard Brueckl for his write-up. Of course! If I could restore my broken-down PowerPivot model into a new SSAS Tabular model, I might be able to save those captive DAX calculations. I fired up the trusty VM and went to work: open SSMS, connect to your Tabular instance of SSAS, right-click on 'Databases', and select 'Restore from PowerPivot'.

It worked! I was able to restore into an SSAS Tabular model, then open that in SQL Server Data Tools, where I could retrieve all of my DAX calculations. I have yet to figure out exactly what caused the corruption, or whether there is a cleaner way of fixing it, but this worked for my purposes. I ended up recreating my model from scratch, but most of the work (writing and testing those DAX calcs) was already done.

Want another, perhaps more straightforward, option? Open SQL Server Data Tools and use the handy wizard to create a new SSAS project using the 'Import from PowerPivot' project type.

I sent this blog post around to people smarter than me for review, and one particularly bright chap wrote back with a tasty tidbit. After restoring your PowerPivot model into SSAS Tabular, you can actually convert it back to an Excel PowerPivot file, thus completing the circle of life. There's not a wizardy type of interface to do this, but this post walks through the steps pretty clearly. It's a further testament to the fact that identical technologies are employed in both PowerPivot models in Excel and Tabular models in SSAS.


============================
Error Message:
============================

An item with the same key has already been added.

============================
Call Stack:
============================

at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at Microsoft.AnalysisServices.Common.LinguisticModeling.SynonymModel.AddSynonymCollection(Tuple`2 measure, SynonymCollection synonyms)
at Microsoft.AnalysisServices.Common.LinguisticModeling.LinguisticSchemaLoader.DeserializeSynonymModelFromSchema()
at Microsoft.AnalysisServices.Common.SandboxEditor.LoadLinguisticDesignerState()
at Microsoft.AnalysisServices.Common.SandboxEditor.set_Sandbox(DataModelingSandbox value)
at Microsoft.AnalysisServices.XLHost.Modeler.ClientWindow.RefreshClientWindow(String tableName)

============================

Dec 10 14

Internet of Things, A Reference Architecture

by Bob Familiar

We at BlueMetal have a great deal of experience creating solutions that leverage connected devices streaming millions of records to cloud-hosted repositories, historical and predictive analytics engines that provide insight, and immersive experiences that give our clients agility and speed in their daily business activities. Case in point: EnerNOC.

EnerNOC

Using Lean Engineering, a small team from BlueMetal was able to provide EnerNOC with an amazing user experience in 10 weeks. The application spans twenty-four 55" high-definition displays, combining data streaming from over 35,000 devices around the globe connected to the energy grid with geo-political, social media, environmental, and financial data. The result is a visually stunning global view of EnerNOC's business domain, visible to all employees and every visitor to their downtown Boston office.


While this solution was clearly focused on the energy sector, the effort produced a reference architecture for Internet of Things solutions that spans verticals. To prove out the viability of this reference architecture in other verticals, BlueMetal created a Pharmaceutical Trial Scenario and implemented a live reference implementation.

Read how we did it…

Nov 11 14

Lean Engineering – Lean Methodology Applied to Enterprise IT

by Bob Familiar

At BlueMetal we apply Lean Engineering principles to help our clients guide the creation and deployment of software products at high velocity with low risk. In this article, Bob Familiar, Practice Director for Cloud & Services, gives readers an overview of Lean Engineering: its historical underpinnings and the principles that guide the process, methodology, and architecture of the products we create with our clients.

Read more…

Nov 7 14

BlueMetal and Microsoft show how the Internet of Things will transform the way you do business

by Sadie Van Buren


The Internet of Things is not a futuristic technology vision. It’s here today, and you probably have elements in your operations already.

Join us for an inside look at how the Internet of Things can take your business to the next level.  BlueMetal and Microsoft are presenting the following two roundtable events:

December 2, 2014 – Microsoft – Great Valley Corporate Center – 45 Liberty Blvd, Suite 210 – Malvern, PA  19355 – health care focus

Click here to register for the Malvern event

December 5, 2014 – Microsoft – One Cambridge Center – 255 Main St., Cambridge, MA – broad focus, keynoted by Dr. Abel Sanchez, Massachusetts Institute of Technology

Click here to register for the Cambridge event

Agenda for both events:

8:30–9:00 Registration and Breakfast
9:00–9:30 Keynote
9:30–10:30 Industry Solutions & Demos
10:30–11:30 Discussion and Close

We hope to see you there!

Oct 14 14

Team Wrap-Up from the 2014 Xamarin Evolve Conference in Atlanta

by Sadie Van Buren

Last week a group of BlueMetal's software architects and engineers attended Xamarin's Evolve conference in Atlanta, and we wanted to share their key takeaways from this exciting event.

Stel Avramidis:

Xamarin Evolve was a lot of fun. Besides all the great updates from Xamarin, I got to experience great sessions from some leading experts in UX mobility and design. I also had a chance to meet and socialize with fellow BlueMetalers from other offices, as well as with Matt Larson, our partner manager from Xamarin.

My takeaways are not much of a surprise given the following major announcements:

• The new cross-platform Xamarin Profiler, for profiling applications on Xamarin.iOS and Xamarin.Android. This is a huge improvement over their previous mono log profiler. The new profiler looks and behaves more like Apple's Instruments profiling tool. Although you could profile a Xamarin.iOS application using native tools such as Instruments, that did not provide a consistent, unified cross-platform experience for both Xamarin.iOS and Xamarin.Android.

• Another major announcement was the Xamarin Android Player. This is a godsend for anyone who has done Android development. The Android emulator provided by Google takes eons to bootstrap and load an application for a debug session; the new player from Xamarin cuts that time from minutes to seconds. Anyone doing native Android development should feel jealous and should be sending complaint letters to Google to do the same with their native Android emulator. It just proves that Xamarin is a great company building great tools, and a smart one that knows keeping developers productive and happy is integral to its success.

• Sketches was also announced, another great tool in the theme of keeping developer productivity and happiness high. It improves on the typical write, build, test, and deploy workflow: developers write code and instantly see intermediate results without building and deploying. And it's not limited to the watch-list output of a standard debug session – it can show what the UI will look like as well. In sum, Sketches is a cross-platform tool for quickly prototyping and iterating on ideas when creating both Xamarin.iOS and Xamarin.Android applications.

• The other major announcement was a set of improvements to the Xamarin Test Cloud. The Test Cloud is an invaluable tool for any organization taking on a native cross-platform project, especially when it comes to testing Android devices; based on this article, there are over 19,000 distinct Android devices in the wild (http://9to5google.com/2014/08/21/there-are-almost-19000-distinct-android-running-devices-in-the-wild/). The major takeaways were a new cross-platform automated UI testing framework called Xamarin.UITest; test execution enhancements that allow parallel execution of test scripts across devices, improving overall testing performance (before this, submitted test scripts could queue up and run one after another on the same device); and new video playback capture of the application UI while it runs through test scripts.
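For a feel of what Xamarin.UITest scripts look like, here is a hedged sketch – the APK path and the Marked identifiers are invented, and the same test can run locally or be submitted to the Test Cloud:

using NUnit.Framework;
using Xamarin.UITest;

[TestFixture]
public class LoginTests
{
    IApp app;

    [SetUp]
    public void BeforeEachTest()
    {
        // Point at a local build of the app; Test Cloud runs the same tests at scale.
        app = ConfigureApp.Android.ApkFile("bin/Release/com.example.app.apk").StartApp();
    }

    [Test]
    public void TappingLoginShowsHome()
    {
        app.EnterText(c => c.Marked("username"), "demo");
        app.Tap(c => c.Marked("loginButton"));
        app.WaitForElement(c => c.Marked("homeScreen"));
    }
}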

Some other notable announcements were Xamarin support for AWS mobile services and XIB file support for Xamarin.iOS, though it was not disclosed when the new designer features will be released. When XIB support does roll out, developers will no longer need to use Xcode's Interface Builder to create views.

Travis Nielsen:

I would say Xamarin is "evolving" to support a fuller spectrum of mobile development. We now have Sketches for quick prototyping; upgraded platform tools for building (resource monitor and Android virtual machine); a pretty sweet-looking testing platform (Xamarin Test Cloud); and Xamarin Insights, a real-time monitoring tool that gives you pretty rich user-behavior reporting, error logging, and user notifications ("we fixed that bug you were complaining about").

So Xamarin is becoming more involved across the project plan as you move from left to right… which is smart and necessary. Test Cloud in particular is a potentially killer feature. The company is committed to giving developers the tools to make mobile as awesome as possible.

In general, the Xamarin team seems to be growing at a pretty amazing pace. They were very engaged with conference participants and they’re easy to talk to (and work with).

Roman Yugov:

XAML/MVVM: XAML support in Xamarin.Forms was very basic – it supported only object creation and initialization. The C# equivalent (using object initializers) is equally concise and seems to be very popular. There is not yet a visual designer for XAML; in fact, Charles Petzold mentioned that the preview of his "Creating Mobile Apps with Xamarin.Forms" book did not contain any XAML code at all. It was still possible to create MVVM-based applications anyway (and really cool ones, as Charles showed in his "Xamarin.Forms is Cooler than You Think" session). Here is the game changer: a technical preview of the Xamarin.Forms 1.3.0 release was announced on October 8, with behaviors, triggers, styles, dynamic resources, and styles based on dynamic resources. MVVM Light V5, with full Xamarin support, was also released by Laurent Bugnion on October 8. Xamarin.Forms 1.3.0 with the MVVM Light framework could be very attractive to WPF/SL/WinRT developers, allowing an almost seamless transition into mobile development.
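For anyone who hasn't seen the C#-only style Roman mentions, here is a rough idea of a page built with object initializers instead of XAML (the page and its contents are illustrative):

using Xamarin.Forms;

public class GreetingPage : ContentPage
{
    public GreetingPage()
    {
        // Object initializers stand in for XAML markup in Forms 1.x.
        Content = new StackLayout
        {
            Padding = new Thickness(20),
            Children =
            {
                new Label { Text = "Hello, Evolve!" },
                new Button { Text = "Tap me" }
            }
        };
    }
}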

Custom renderers: Another hot topic – maybe the hottest; the session room was packed. A custom renderer lets you implement platform-specific customization (look and behavior) of Xamarin.Forms controls. This technique is especially important for custom control vendors, and major vendors already have native UI control sets for Xamarin.Forms.
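As a sketch of the technique, here is the shape of an iOS custom renderer that strips the border off every Entry – the MyApp.iOS namespace is made up, and the Android side would subclass its own EntryRenderer:

using UIKit;
using Xamarin.Forms;
using Xamarin.Forms.Platform.iOS;

[assembly: ExportRenderer(typeof(Entry), typeof(MyApp.iOS.NoBorderEntryRenderer))]

namespace MyApp.iOS
{
    public class NoBorderEntryRenderer : EntryRenderer
    {
        protected override void OnElementChanged(ElementChangedEventArgs<Entry> e)
        {
            base.OnElementChanged(e);
            if (Control != null)
            {
                // Control is the native UITextField; platform tweaks go here.
                Control.BorderStyle = UITextFieldBorderStyle.None;
            }
        }
    }
}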

Mobile backend as a service (MBaaS): Two MBaaS providers presented at the conference, KidoZen and AnyPresence (along with Microsoft and Amazon).

——————-

BlueMetal was a proud sponsor of this conference.

Oct 6 14

Bob German to present at two Beyond Tech-Ed events this week, 10/7 and 10/9

by Sadie Van Buren

Bob German will co-present a session on OneDrive for Business with Chris Chalmers at two Beyond Tech-Ed events this week:

October 7, 2014 – Hartford Marriott, 200 Columbus Blvd, Hartford, CT 06103

Details and Registration for Hartford

October 9, 2014 – Microsoft Corporation, 255 Main Street, Cambridge, MA 02142

Details and Registration for Cambridge

Beyond Tech-Ed 2014 brings you the latest information on Microsoft products and solutions based on the most popular sessions at Microsoft Tech-Ed, Microsoft's premier technical conference. This free, one-day event consists of four technical tracks delivered by Microsoft product experts. Registration starts at 8:30am, and both breakfast and lunch will be served.

Sep 30 14

The secret to delivery – location, location, location

by James Horgan

Recently, BlueMetal was selected as a finalist for a Best UX – Product award at MITX, an esteemed network of technology marketers. The nomination was for a large-scale data visualization wallboard produced for the energy monitoring company EnerNOC.

What's really impressive about this experience is that we went from this:

[image: before]

to this:

[image: after]

in under 8 weeks.  Most of our clients and colleagues are astounded by this result and have many questions:

  • Which process did we use: Agile or Waterfall?
  • How many iterations?
  • When did the data architect come on board?

The answer is very simple – we put the team (a UX lead, a data architect, a UI engineer, an animator and a visual designer) all in one room, on site where the wallboard was being installed.

We used parallel streams of work to ensure design stayed aligned with architecture and that communication across the team was constant and consistent. Due to proximity to the client, the team was able to rapidly problem-solve on the fly, which became crucial as visual design and data got closer together.

There was a classic waterfall process, but compressed: Discover, Define, and Design, or as we call it: BlueSky, BluePrint, and BlueMetal.

During BlueSky, interviews allowed the designer to sketch out the experience quickly, and the developers were part of that brainstorming. This allowed both the UI engineer and data architect to understand the scope of requirements, course-correcting as the design became more established.

With BluePrint, we whiteboarded every piece of data that needed to be displayed on the wallboard in a single brainstorming session. Both the client and the project team took part, and a firm agreement was put in place that the brainstorming could not be considered complete until we all agreed on every piece of functionality. Drawing this line in the sand created intense focus and a rapid approach to design.

Finally, during the BlueMetal stage (Design and Build), we ran separate streams of visual design, UI framework, and data architecture, working side by side not only to converge on a rapid prototype but to iteratively create a solid, high-quality final deliverable.

The key to success on intense projects like this is rapidly solving problems together, and a lot of that has to do with location. By ensuring everyone was sitting side by side, every team member took equal accountability for the success or failure of the project, driving the team forward to ensure a favorable result.

An example of how location affects team dynamics is shown below.

[image: a designer and a data architect reviewing the same screen]

Though both team members are looking at the same screen, they are looking for different things. The visual designer on the left is looking at the readability of the typeface, the colors presented, and the meaningfulness of what's being shown. At the same time, the data architect is looking at how the live data feeds into the interface, and can not only troubleshoot but also work with the designer if the type size doesn't fit the number of characters the data is expected to pull in. Similarly, on the live file that both developers and designers worked on in WPF (Windows Presentation Foundation), designers used style-guide markings to show the UI engineer how to adhere to the style, which also allowed the engineer to communicate in the same design language when certain parameters were not working.

[image: style-guide markings on the shared WPF file]

Another simple example is the need for world clocks (shown below), something we observed was needed in EnerNOC's 24/7 work environment. Because the team was cohesive, the creation, placement, styling, and implementation of the clocks was something we could do within a very short time span.

[image: world clocks on the wallboard]

This type of constant communication, rapid problem solving, and parallel work can only happen when your team is in the same room, so you should always decide WHERE your team delivers before you decide HOW.


Sep 26 14

IoT with Azure Service Bus, Netduino and Gimbal

by Amol Ajgaonkar

Netduino is an open-source electronics platform using the .NET Micro Framework. I used a Netduino to create an IoT application that turns on the lights in my garage as I get closer to it.

Hardware:

  1. Gimbal (Proximity Sensor)
  2. Netduino Plus 2
  3. VeraLite (Z-wave controller)
  4. Z-wave enabled outlet.

Software:

  1. Azure Service Bus Queue – set up to hold the commands/messages sent from the mobile app and consumed by the Netduino/WCF service.
  2. Xamarin – to build the mobile app that detects the proximity sensor and sends commands to the Netduino via the Azure Service Bus Queue.

This is the way I envisioned it to work:

[diagram: the envisioned architecture]

  1. The proximity sensor sits in the garage.
  2. The mobile app gets proximity information from the sensor as the phone approaches the garage.
  3. If the user is close, the app sends a message to the Azure Service Bus queue (see the sketch after this list).
  4. The Netduino polls the queue for new messages.
  5. If the Netduino finds a message, it checks the command passed in the message.
  6. The Netduino then connects to the VeraLite (Z-wave controller) and sends a command to turn the light on or off, based on the command from the message queue.
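For step 3, the simplest phone-side option is the Service Bus REST API: sending a message is just an authenticated HTTPS POST. A minimal sketch, with the queue URI and SAS token as placeholders (generating the token is out of scope here):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class CommandSender
{
    // Placeholders – substitute your namespace, queue, and a real SAS token.
    const string QueueUri =
        "https://my-namespace.servicebus.windows.net/garage-commands/messages";
    const string SasToken = "SharedAccessSignature sr=...&sig=...&se=...&skn=...";

    public async Task SendAsync(string command)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", SasToken);
            var content = new StringContent(command, Encoding.UTF8, "text/plain");
            var response = await client.PostAsync(QueueUri, content);
            response.EnsureSuccessStatusCode(); // Service Bus returns 201 Created
        }
    }
}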

In theory this should all work, and for the most part it does. I ran into a couple of issues, though, which might be useful to anyone working on similar projects.

  1. Proximity values from any sensor fluctuate constantly; they are never precise. The sensor really only tells you that it is nearby, so don't rely on the distance value – it is based on signal strength and is not reliable.
  2. The Netduino runs the .NET Micro Framework, which currently does not support SSL. All the blogs I read said that an SSL implementation takes up too much memory, and the Netduino does not have enough to support it. So you cannot connect directly to the Azure Service Bus queue. Hopefully, someone will implement the SSL stack soon.

But if we modify the architecture a bit, we can work around the SSL limitation. Here is the modified architecture that worked.

[diagram: the modified architecture with a WCF proxy]

I added an additional layer that acts as a proxy between the Netduino and the Service Bus. Instead of connecting to the Azure Service Bus queue endpoint, the Netduino connects to a WCF service hosted behind the firewall. The WCF service checks for messages in the Azure queue and returns them to the Netduino.
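Here is a rough sketch of that proxy: a plain-HTTP WCF service inside the firewall that drains the queue on the Netduino's behalf. The names and connection string are placeholders, and it assumes the Microsoft.ServiceBus client library:

using System;
using System.ServiceModel;
using Microsoft.ServiceBus.Messaging;

[ServiceContract]
public interface ICommandRelay
{
    [OperationContract]
    string GetNextCommand();
}

public class CommandRelay : ICommandRelay
{
    static readonly QueueClient Queue = QueueClient.CreateFromConnectionString(
        "Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
        "garage-commands");

    public string GetNextCommand()
    {
        // A short server-side wait keeps the Netduino's polling loop simple.
        BrokeredMessage msg = Queue.Receive(TimeSpan.FromSeconds(5));
        if (msg == null) return string.Empty;

        string command = msg.GetBody<string>();
        msg.Complete(); // remove the message once it's handed to the device
        return command;
    }
}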

By using the Azure Service Bus Queues, we don’t have to open up ports on our router or setup port forwarding. Doing so, opens our network for all kinds of attacks. Not being a security expert, I would rather have all incoming ports shut tight than try and implement an authentication mechanism.