The latest copy of the Kanata North BIA Networker magazine is out and there is another article in the Serious Techie series…
Check it out
Over the past three years I have come to rely more and more on virtual machines. Virtual machines allow me to create snapshots of development and product environments that are easy to back up and share with others. From a hardware point of view, this means I need more cores and more memory. It is not uncommon for me to be running anywhere from two to eight virtual machines at any given time.
In each case the VMs have to be configured to share the resources available on the host while leaving enough behind for the host OS. The work that I am doing at the moment requires target VMs, development VMs and product VMs.
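As a rough illustration, that split can be planned up front. This is just a sketch with made-up numbers and VM names; it prints the standard VBoxManage commands rather than running them:

```python
# Sketch: dividing host resources among VMs while reserving headroom for
# the host OS. All numbers and VM names below are hypothetical.

HOST_RAM_MB = 32768
HOST_CORES = 8
HOST_RESERVE_MB = 8192     # leave this much RAM for the host OS
HOST_RESERVE_CORES = 2     # and this many cores

vms = ["target-vm", "dev-vm", "product-vm"]

ram_each = (HOST_RAM_MB - HOST_RESERVE_MB) // len(vms)
cores_each = max(1, (HOST_CORES - HOST_RESERVE_CORES) // len(vms))

for name in vms:
    # Print the commands instead of executing them.
    print(f"VBoxManage modifyvm {name} --memory {ram_each} --cpus {cores_each}")
```

With the assumed 32 GB / 8 core host, each of the three VMs would get 8 GB and 2 cores, leaving 8 GB and 2 cores behind for the host.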
Some vendors were prepared for the virtual reality and have designed their platforms so that they can be easily expanded to accommodate more memory and cores. Surprisingly, Apple is not one of these companies. I find this particularly difficult because my MacBook Pro is my main development machine.
I end up running VMs on my company-supplied Windows laptop and a Dell T7500 server in our lab. That may be coming to an end, though, as companies try to wrestle virtual machines out of the hands of developers. It seems that IT departments in some bigger companies feel that they need to manage all virtual machines. It makes some sense, in that IT departments generally have access to bigger systems that are better suited to hosting VMs.
I use VMs to increase my productivity, and moving control of those VMs to people who are not at my beck and call will have the opposite effect on productivity. I need to spin VMs up and down at a moment's notice, and I need to be able to swap out OSes, reconfigure networks, and download and deploy VMs from other vendors.
Some companies are restricting the virtual world further by limiting the environments where VMs run. I, and most of the rest of the development world, use the very popular VirtualBox environment from Oracle. I have recently been working with OpenStack, and companies like Mirantis provide a quick-start evaluation that contains four prepackaged VirtualBox VMs. Their package automatically deploys the VMs, configures the networks and spins up a cloud environment in minutes.
I have evaluated a number of environments, including VMware, VirtualBox and OpenStack, and VirtualBox is perhaps the most up to date and functional for developers. Unfortunately, VirtualBox seems to be under attack by IT departments who have invested heavily in VMware. In the end it probably won't matter, as the world moves to Network Function Virtualization, which is based on OpenStack.
For the time being, however, the virtual reality is VirtualBox, and developers need more cores and memory!
I’ve been reading a lot about what it means to be a “Software Professional” recently and I find it quite fascinating. The book that I am currently reading suggests that software professionals should treat their jobs the same way that medical professionals treat theirs. By that I mean that every deliverable should be completely defect free and should do no harm.
On the surface this seems like an obvious statement, but in actual fact, that’s not how the industry has been operating. Software designers are rarely given enough time to do the work necessary to test and refactor their work. The reality is more like every project is behind schedule by the time it is presented to the design team and the designer has to draw on magic to get things back on track.
What is magic you ask?
Magic is a compromise between quality and cost that produces a suboptimal result. There are a lot of outside factors that influence quality and, depending on the application, quality may not be an issue. A "Proof of Concept" for a trade show is a good example of a situation where quality takes a back seat to on-time delivery. In these situations, however, we must be very clear that this kind of code is fragile and purpose-built, with a very short life expectancy.
I find that life is very rarely black and white, but when it comes to software design, I'm a big fan of Test Driven Development (TDD) for all projects that contain code with a life expectancy greater than a few months. For a good description of TDD, check out "The Clean Coder" by Robert Martin.
Robert or “Uncle Bob” has summarized a number of scenarios in the first few chapters of his book that every software designer should read. He has drawn on his 40+ years in the industry to craft responses to a number of common situations that most of us have experienced. Let me know what you think.
Every morning I get copied on emails from people who have decided to Work From Home (WFH) that day. It took me quite a while to accept WFH as a valid alternative to being in the office. I understand that some large US companies have decided that WFH doesn't actually work and have instituted a requirement for people to spend a number of hours in the office each week. In some extreme cases, I understand that employees are being told to relocate closer to an office or resign.
There are a number of advantages to being close to the people you are working with on a project, but those advantages generally come with a work environment that is less than ideal. I know that when I work at home my efficiency increases by a surprising amount. That said, home has its own distractions, like dishes and laundry, and it takes conscious effort to remind yourself that you are a professional and your time is not entirely your own.
I'm noticing a backward trend south of the border in many different areas, so I'm not surprised that this trend has started to affect technology workers. I think that this trend actually gives the rest of the world a chance to lead by example. I have adjusted my position on WFH, and I believe that we may actually see companies spending a lot less on physical buildings. High-speed, always-on network connections make distributed workforces actually work.
How do I know that WFH is working for my company?
In order for this approach to work, employees have to be able to connect ad hoc, with little or no notice. Employers may impose new requirements with respect to being reachable and we may see response times tracked as a metric. An unreachable employee doesn’t add a lot of value.
It will take some companies time to understand what WFH means for their people. It is not uncommon for companies to have concerns about employees working from home. A colleague of mine pointed out recently that if you don’t trust your employees, why did you hire them?
In a previous post, I suggested that work environment is something that will help retain employees and is actually valued more than salary in many cases. WFH gives the company a way to save on infrastructure and the employee a way to improve their work environment.
Sounds like a win-win to me.
It’s been my experience that the quantity of tech jobs tends to cycle between feast and famine. We seem to be entering a feast period where the number of jobs exceeds the number of candidates. Couple that with the fact that our economic system seems to balance wages to match the job demand and you have the makings of a perfect storm.
I've been eating out for lunch more lately, and that's not because I enjoy eating out; it's because there have been so many farewell lunches. Tech workers in the trenches are the first to move because they are just starting their careers and they are the most attractive to employers. Let's call this stage 1. Their skills are fresh and they have enough experience to contribute from day one. They tend to be the people that get forgotten when times are lean, so employers don't have to offer large increases to get them to jump.
When that well runs dry or when you need experience to guide the team, employers move up the food chain to attract people with more experience. Let’s call this stage 2. If employers don’t have their people locked in through stock options or competitive salaries and bonuses, the shell game continues.
I was speaking to a tech company CEO last weekend, and he said that with the exception of the new grads that should be available in a few weeks, stage 2 is already happening.
So what can you do to protect yourself from high employee turnover? Well, some of the things we have already talked about: stock options, competitive salaries, bonuses, better work environments, more vacation, flexible work hours and locations, and so on.
Tech workers are not your typical employees. They tend to be less motivated by cold hard cash and more motivated by work environment, interesting things to work on, career development, travel, and the list goes on. Compensation does play a role, but to increase the output of your team, you have to do more.
There will be cases where it’s time to move on in order to advance a career and there is not much that an employer can do about that other than to say thank you for your service and all the best in your future endeavours.
I did come across a good book a couple of months back that I blogged about already though. Check out “The Alliance” by one of the founders of LinkedIn, Reid Hoffman. Reid suggests that what worked at LinkedIn was to treat every employee as a contract worker in that they have something to offer the company and the company has something to offer them. He calls these fixed periods “tours of duty”. It allows the employee and employer to have a clear understanding of what is expected and the fixed term bounds everything.
Many employees sign up for back-to-back tours of duty, but others do their tour and move on. If an employer wants to retain a key employee, they are motivated to offer that person something that will help them achieve their life goals. I was fascinated as I read because I would love to work in a place like that.
In any case, the tech industry is entering a shakedown period, so if you don't want to suffer from brain drain and retrain, it's time to make a change.
I was recently introduced to a concept that has been around for a long time.
Test Driven Development strives for 100% test coverage for every line of code written. At first glance this seems really expensive, but after reading "The Clean Coder" by Robert Martin, I find myself leaning in that direction.
Robert refers to himself as "Uncle Bob" and has a number of very strange videos that explain his approach to software design. His approach encourages designers to write tests first, code second and optimize third. He explains that software professionals have a responsibility to make sure that every line of code they produce has zero defects. Uncle Bob suggests that we all adopt a portion of the Hippocratic oath: "Do no harm." He explains that having a complete suite of tests keeps code alive, because designers can do an instant verification cycle after each code change. This catches bugs early in the cycle and makes designers unafraid to make changes in code that they may not know all that well.
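The tests-first, code-second cycle can be sketched in miniature. The `word_count` function below is a made-up example of mine, not something from the book:

```python
# Step 1: write the tests first -- they define the behaviour before any
# implementation exists.
def test_empty_string():
    assert word_count("") == 0

def test_simple_sentence():
    assert word_count("do no harm") == 3

# Step 2: write the minimum implementation that makes the tests pass.
def word_count(text):
    # Split on whitespace; an empty string yields an empty list.
    return len(text.split())

# Step 3: rerun the tests after every change for that instant
# verification cycle.
test_empty_string()
test_simple_sentence()
print("all tests pass")
```

The point is the order: the tests exist before the code, so every later change can be verified in seconds.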
Uncle Bob explains that the up front investment is going to cost something, but not as much as we might think. The downstream value however, more than justifies the investment. Code estimates will go down with time instead of the more common increase to compensate for unexpected side effects.
As a software designer, I found myself saying “Yes” out loud a few too many times as I read. I would love to work on a code base that had 100% test coverage and I would feel much better about myself knowing that my code had zero defects. I think we all want to do the best possible job and taking pride in what we do makes us feel good about ourselves. Feeling good, makes going to work more enjoyable and that has benefits for us and our employers.
I only had to read the situations described in the first chapter to be hooked. I highly recommend this book as a must read for all software professionals. For those of us who have been around the block, it provides new perspectives on situations that we have already experienced.
One could say that I’ve been around the block when it comes to tech companies and start-ups, so a question that I get quite often from investors is “Is this a good investment?”
It's not a really easy question to answer, but there are some indicators that you can use to weigh your decision. This is by no means a complete list, but it might be a good checklist to determine whether further investigation is justified.
I could go on for pages, but rather than do that, I suggest that you read "The Startup Owner's Manual" for more insight into what makes a successful start-up.
For me, it comes down to technology and people. If those two things work, the rest will happen on its own. The people part is the hardest to get right.
I may have blogged on this before, but things are heating up with respect to tech jobs, so it's probably a good idea to revisit this topic.
There are a number of job search engines that trawl job posting sites and deliver lists to prospective tech workers each day. The content of the emails has some value, but probably not what you would expect. These engines give you information about when the job was posted, but the way they work has a few flaws. If a recruiter picks up a posting from an employer's site, the date gets reset, and a posting that was three weeks old looks like it was posted yesterday. In many cases the jobs that are posted have already been filled.
Applying to jobs from this sort of a list will not produce good results.
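One way around the reset dates is to key on the posting content rather than the date, so a repost doesn't look new. A small sketch, with made-up postings:

```python
# Sketch: the posting date is unreliable because reposts reset it, so we
# deduplicate on the content (title + company) and keep the earliest
# sighting. The postings below are invented examples.

postings = [
    {"title": "Embedded Developer", "company": "Acme", "posted": "2015-05-01"},
    {"title": "Embedded Developer", "company": "Acme", "posted": "2015-05-21"},  # repost
    {"title": "QA Lead", "company": "Widgets Inc", "posted": "2015-05-20"},
]

earliest = {}
for p in postings:
    key = (p["title"], p["company"])   # identity is the content, not the date
    if key not in earliest or p["posted"] < earliest[key]:
        earliest[key] = p["posted"]

for (title, company), date in sorted(earliest.items()):
    print(f"{title} at {company}: first seen {date}")
```

A three-week-old posting that reappears with yesterday's date collapses back to its original sighting, which is a much better signal of whether the job is still open.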
So what do these sites do for you? They give you a general idea about who is hiring. Use that information to research employers and then use LinkedIn to figure out who you know in the companies that you are interested in. LinkedIn will tell you about connections through other people in your network, so if you don’t have a direct connection, ask someone you do have a connection with to introduce you.
Once you have a connection, set up meetings over lunch or coffee. Tell people that you are interested in their company and would like to learn more. People have to eat, like to drink coffee and love to talk about their work. There is no commitment at a meeting like that, so the odds that you will be successful in getting a meeting are actually quite high.
During the meeting there will be opportunities to ask appropriate questions that will allow you to create an impression in the other person's mind. It's an indirect way to sell yourself and raise your profile. You could also use the meeting to find out more about hiring managers, the status of posted jobs and what departments are hiring. Use that information to set up follow-up meetings.
If you do a really good job with these informal meetings, people will assume that you will only be on the market for a very short period. You need to keep having meetings so that people know you are still available. You can always say that you are still available because you are waiting for an opportunity that is a good fit. It's important for employers to know that you are in control of the situation.
Eventually a hiring manager is going to ask one of your contacts if they know anyone who might fit a job opening, and your name will come up. When it does, it will come with a recommendation from someone the hiring manager trusts.
It only takes one recommendation to get a job, so be patient.
I was in a customer meeting last week with a product manager, salesman, customer and senior architect. They were describing a new product offering to a customer and all of the people involved were injecting “you know” into the conversation. The meeting went well and I suspect that everyone involved was unaware of the “you knows”, but I was really distracted. Each person had a different rate of insertions, but there were well over 200 “you knows” in the one hour meeting.
I have observed similar behaviour with other pause phrases like “ummmmm” so the actual content of the phrase is not that important. What is important is that a considerable amount of time and energy was spent relaying no information. I’m pretty sure that all the people involved did actually know.
As an outsider who was for the most part silent, it was fascinating to watch. The exchange could have taken 20-30% less time, and I feel that random pause phrases highlight the speaker's inability to keep the conversation flowing with relevant information. It seems kind of unprofessional and implies a lack of preparation for the exchange.
Maybe I’m just getting old and crusty, but on the off chance that I am right, you could do what I do when I’m playing music. Record yourself and review how you speak. I believe that you will get much better results without the pause phrases. Leaving silent pauses will also allow other people to speak and add to the conversation.
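The record-and-review idea can even be roughly quantified. This sketch counts pause phrases in a transcript; the transcript and the half-second-per-filler figure are made-up numbers for illustration:

```python
import re

# Sketch: count pause phrases in a (made-up) transcript and estimate the
# time they consume. The 0.5 s cost per filler is an assumed figure.

transcript = (
    "So, you know, the product, you know, ships next quarter, umm, "
    "and, you know, the pricing is, umm, flexible."
)

fillers = ["you know", "umm"]
counts = {f: len(re.findall(re.escape(f), transcript.lower())) for f in fillers}

total = sum(counts.values())
seconds_each = 0.5  # assumed average cost per filler
print(counts, f"~{total * seconds_each:.1f}s relaying no information")
```

Run something like this over a transcript of your own recording and the result can be an eye-opener, the same way hearing yourself play back a song is.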
There's a song by 3 Doors Down that refers to the singer's life as containing things that are real and make believe. The singer is struggling to tell the difference between the two. I suspect that everyone has moments like that in their life, but that's a little too personal for a blog like this. In this case I would like to reflect on how this concept has played out over the years in the tech community.
Tech companies have always had to play a game where they need to keep investors engaged long enough to buy the time they need to build the product. Investors have become increasingly impatient, and I blame the culture that has evolved from the Internet for that. In the '80s I often found myself being asked to demo a product at a point where it was barely booting, while I listened to the salesperson talk about how it was ready to ship.
At the time I didn't understand the pressures that promote this kind of behaviour. Those pressures have not gone away; in fact, they have gotten much worse. I have had people ask me what happened to the tech community in Ottawa. They reminisce about the tech boom of the '80s and '90s and question why that sort of thing is not still happening. They blame offshore investment and have a laundry list of others to blame.
If there is to be any blame, it should be focused on the investment community. If you want to build a product that is going to be innovative and produce a long term ROI, it is going to take Research as well as Development. The investment community seems to want to put money into development and avoid research. You can’t have it both ways. Some products need more than two years to develop and I would suggest to you that those products will produce the best returns.
So what do tech companies do when they can't get funded to build those products? They sell their ideas to offshore companies that have investors who are willing to wait, or they build a product that is make believe while they work furiously on the product that is real.
Is that not misleading the investors? Yup, but there are many new laws to guard against this kind of misdirection. Companies have to walk a much finer line.
So what is the answer then?
I believe that setting expectations up front with investors would be a good start but the real change has to happen in the investment community itself.
Why were companies like Nortel able to innovate for so many years?
Because they were considered a blue-chip stock and people made long-term investments. Nortel had an awesome track record, was honest with its investors and hired only the top graduates. But they failed in a messy cloud of controversy, you say. To that I would reply: read the book 100 Days.
It would be nice to not have to “Make Believe” and just stick with what’s “Real”. In order for that to happen though, investors are going to have to put in the time to find out what their investments are doing and accept some risk.
What is a reasonable amount of risk and how can I tell real from make believe? Excellent question and the topic of a future post.