I often get asked to recommend team collaboration tools. The advice I give seems overly simplistic to me, but I get enough compliments on it that I’ll share it here.
There’s a plethora of options, most of which overlap (but few of which play well together), which unfortunately makes things more complicated. I could expound ad nauseam on the topic, but I’ll attempt to be brief and offer three key steps for selecting a tool or tools:
First, determine what problem(s) you’re trying to solve. Do you want to improve communications between engineering and marketing? Perhaps you need a better way for new employees to find internal answers more quickly. Or maybe you need better ways to communicate across geographies and time zones, but everyone’s email is overflowing.
Next, think about the givens you can’t change. For example, perhaps your company’s management uses PowerPoint religiously for presentations and the finance folks live in Excel. If the engineers use Google Docs, then you may have to think about whether you need both, or which constituents could more easily switch over. The choice of desktop productivity tools can have a real impact on which collaboration tools you use. And if knowledge management is a goal, consolidating formats and repositories will make a knowledge repository much easier to build.
Then, take small steps. Don’t rush out and sign up for a whole bunch of tools because you think you’ll need one in each category. For example, once you’ve solved document collaboration, you may find that a shared spreadsheet works better for task management than a dedicated but unintegrated point solution.
Only then should you start narrowing down choices. Every tool has pros and cons, and everyone will have subjective preferences.
For example, at WaveMaker, we needed to bridge a team spread across geographies and time zones, and we use MS Office for most of our documents. So we went with Yammer (a better tool than email for this), but it’s weak for document collaboration, so we also use SharePoint Online (MS Office document collaboration was key, and Yammer is now owned by Microsoft, even if the two are still poorly integrated). We use Salesforce.com for CRM, but found Chatter also lacking at document collaboration, which mattered more to us than CRM integration.
I hope that’s helpful.
After upgrading to Windows 10, I discovered a bunch of shortcomings…
10. OneDrive placeholders are no longer supported. What? How do you now use that 1 TB of cloud storage you planned on using as virtual storage for your MS Surface?
9. Search in the Windows 10 Start menu is slower.
8. The battery system tray icon no longer lets you switch power plans. That option is now buried several clicks deep in the UI.
7. All of a sudden, my monitor’s overscan handling stopped working.
6. If you set OneDrive not to start at Windows 10 startup, good luck figuring out how to start OneDrive manually (hint: look for OneDrive.exe in Programs).
5. Edge is Windows 10’s new browser, but it doesn’t support all SharePoint capabilities, like “Open with Explorer”, and guess what…you can’t install another IE version (the clever IE download site confirms you already have the latest version)!
4. OneDrive folder selection won’t let you proceed unless it thinks you have enough free disk space, even though that disk space is occupied by the previously-synced OneDrive files themselves.
3. Media Center is gone from Windows 10 entirely.
2. A mounted VHD in Windows 10 keeps randomly unmounting (it worked for years on Windows 7 through 8.1), and then OneDrive cannot complete syncing if the mounted drive isn’t available long enough.
And the #1 reason why I should not have upgraded to Windows 10 is….
1. Where’s the benefit? I see zero advantages over Windows 8.1, only regression.
The good news (if you upgraded) is that you have 30 days to revert to your previous version of Windows (7–8.1): click Start > Settings > Update & Security > Recovery > Go back to Windows xx.
Docker was the #2 “best overall open source cloud project” in a July 2014 survey by The Linux Foundation and The New Stack. Google Trends sheds some light, though, on the relatively recent arrival and rapid acceleration of Docker versus OpenStack and “Virtualization”.
Google Trends - March 5, 2015
Software delivery used to have a major drag: software had to be delivered on some physical medium like tape or disk (or via download) and then installed. Since software could only be consumed so quickly, the impetus for independent software vendors (ISVs) to build and deliver new software came on perhaps an annual or semi-annual basis. SaaS has changed all of that. Now there’s no reason a bug cannot be fixed and delivered…instantly, or at least as “fast as possible.”
Web-scale SaaS companies (a la Facebook, Netflix, and a big slew of smaller SaaS companies) have been striving to optimize their software release stream to do just that. These modern ISVs have given rise to “DevOps” teams focused on this task of “Continuous Delivery”, and to configuration-automation tools like Puppet and Chef for scripting release cycles.
The problem with these approaches is they are still prone to error. Creating a test environment via scripts is not 100% guaranteed to build an exact replica of the development environment, for example. So when QA finds a bug that is not reproducible in the development environment, valuable time is wasted determining if the bug is in the software or the environments.
Docker changes this paradigm. Instead of pulling your hair out trying to recreate “identical” environments, Docker gets much closer to simply moving the actual environment around. This eliminates errors due to release environment differences, along with the need to build and maintain lots of complex scripts.
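To make that concrete, here’s a minimal sketch of how an environment gets captured in a Dockerfile (the base image, file names, and app are all hypothetical), so development, QA, and production run the very same bits:

```dockerfile
# Hypothetical Dockerfile for a small Python service.
# Everything the app needs -- OS layer, runtime, libraries, code --
# is baked into one image, so every environment runs the same thing.
FROM python:3-slim

WORKDIR /app

# Install dependencies into the image itself, not onto the host.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Add the application code.
COPY . .

CMD ["python", "app.py"]
```

A developer runs `docker build -t myapp .` once; QA then runs `docker run myapp` against that same image, so “works on my machine” environment differences largely disappear.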
Docker provides additional benefits to SaaS ISVs as well. Docker is very lightweight and fast, which makes it easy to scale and resource-efficient, and for new cloud ISVs that may not be invested in technology like VMware, Docker provides many of the classic benefits of virtualization.
Understandably, large-scale SaaS ISVs lead the way in innovation, including Docker implementation. Enterprise developers learn from these best practices and are starting to adopt Docker. Enterprises, too, benefit from the speed and efficiency of continuous delivery (CD), especially those building customer-facing SaaS applications at scale. But there’s an even larger number of internal applications that perhaps don’t need daily updates, yet certainly need the greater efficiency Docker delivers for internal teams with limited resources.
What enterprise IT may not realize is that if it does not provide Docker for its developers, those developers now have numerous external choices of publicly hosted Docker services. In many cases, IT does not want its apps, workloads, and data running on unsanctioned services, and the real problem is that IT probably won’t know it’s happening until well after the fact.
Forward-thinking IT organizations are providing Docker for their developers, and those furthest along are discovering additional benefits of Docker, ranging from lower costs (via greater resource utilization or reduced VMware licensing costs) to workload portability for moving apps between public and private cloud infrastructures.
But enterprise IT departments are also realizing that Docker is not so simple to implement, especially given the unique needs of an enterprise. Docker is not an out-of-the-box solution. To implement Docker, you must not only manage containers and orchestration, but also ensure resource isolation and access control for security; streamline diverse stack support and upgrades; optimize data snapshotting, backups, and recovery; implement monitoring at the machine, instance, container, and workload levels; and integrate with existing systems like LDAP or Active Directory.
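As one small illustration of the resource-isolation piece, a Docker Compose file can cap what each container may consume (the service name, image, and values here are hypothetical):

```yaml
# Hypothetical docker-compose.yml fragment (version 2 file format).
version: "2"
services:
  web:
    image: mycompany/web:1.4
    mem_limit: 512m    # hard memory cap for the container
    cpu_shares: 256    # relative CPU weight vs. other containers
    ports:
      - "8080:80"
```

Limits like these keep one team’s workload from starving another’s on shared hosts, but note they are only one line item on the list above: access control, monitoring, and LDAP/Active Directory integration still have to be layered on separately.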
SaaS ISVs need to scale a single application, on a single app stack, on a single infrastructure, and they have engineering resources dedicated to implementing Docker and all requisite components. Enterprises, on the other hand, have diverse workloads, varied app stacks, heterogeneous infrastructures, limited resources, and additional needs for security, control, visibility, governance and compliance, so implementing all of the Docker related technologies for these permutations can be daunting.
That’s where PaaS comes in. A Docker-optimized PaaS for the enterprise should take care of implementing Docker and its required components, while meeting additional needs like role-based access control built on existing IDM, with visibility and monitoring. A well-designed PaaS will also leverage the unique advantages of Docker to deliver features like idle-workload hibernation for further infrastructure cost savings.
Docker is young in the timeline of enterprise technology, but we’ve rarely seen a technology grow and rise so quickly. Enterprises will lag slightly behind SaaS ISVs in Docker adoption, but smart IT organizations will not only provide Docker to their developers before they wander elsewhere, they will also discover the other powerful advantages of Docker unique to enterprises, without spending an arm and a leg.
It’s hard to know what to be more impressed with: the watch, the technology, the design, the demo; even the product videography that went into the video is amazing. Apple still has it after Jobs, although this product has clearly been in development for years. What I find myself wondering is how many things this will disrupt. I cringe for watchmakers (who have taken a beating from smartphones already) and fitness specialty watches. Will this cannibalize, a bit, even smartphone usage? With payment built in, what will I do with my Coin, which I haven’t even received yet, let alone the phone payment options? Although…you still need an iPhone to use this “iWatch”. Unfortunately, I have an Android device, so I find myself wondering who will (or could) develop a device even close to this cool. Samsung? HTC? Well, maybe some cobbled subset, given enough time. Until then, I’ll live with my trusted Ironman and Tissot timepieces (who uses those anymore?). The “iWatch” looks pretty cool, but then again, I did buy an Apple Newton back in the ’90s. We’ll see.
Here’s a link to the webcast I did on BrightTalk this morning. Docker is hot. APIs are ubiquitous. aPaaS is finally gaining momentum. And enterprises are facing increasing business challenges and complexity. How can these trends and technologies help? How does RAADD (Rapid API Application Development and Deployment) foster innovation and agility? How do Docker and containerization really help optimize app workloads? Find out from Samir Ghosh, CEO of WaveMaker, as he gives you the end-to-end view of Docker aPaaS and talks about steps companies can take to effectively prepare for and leverage these trends and technologies. I hope you’ll give it a high rating on BrightTalk if you like it. Thanks!
Within a business, the collective processes can be viewed as a spectrum. The more repetitive, team-oriented, and generic a process, the more likely there will be a packaged line-of-business (LOB) application for it, like CRM, ERP, etc. More proprietary processes, those tied to the business’s competitive advantage, will likely need BPM solutions or custom applications. However, if a proprietary process is not highly repeatable, or does not involve many people, individuals tend to choose their own tools: email, spreadsheets, chat, etc.
Social Collaboration fills the gap where processes are too proprietary and ad hoc (different enough each time) to warrant a custom or BPM solution, but also too collaborative or team-oriented for email to be an efficient solution.
By leveraging Social Collaboration, these important “long-tail” processes can also be tamed: capturing and sharing learnings, providing transparency for broader contributions, enabling faster responses via self-subscribed activity notifications, and so on. These benefits are lost when relying on email, chat, phone, and meetings alone to fill this gap.
Forgive me if I’m stating something obvious. I’m just surprised how often I see people (in all kinds of situations) make recommendations without providing any rationale behind their recommendations. IMHO, this makes recommendations, and all the hard work and thought that went into them, virtually meaningless. Always frame the problem first and start with the primary objective.
Marquee consulting firm McKinsey has an approach to problem framing called “MECE”, which stands for “Mutually Exclusive and Collectively Exhaustive.” The value is more in the philosophy than in any specific technique. Implementation can be done in many ways (outlines, graphics, mind maps, etc.); the key is achieving the MECE framing objective.
Gutenberg’s printing press initiated mass knowledge capture via books. The Internet is enabling another monumental knowledge boon, but one not limited to just web pages. In fact, a rapidly increasing pace, more multitasking, greater agility, and even shortening attention spans are moving us away from the era of large, heavy books and documents toward an increasing amount of micro-content. Major decisions and valuable knowledge can be found in micro-content, including email, microblogging, and SMS mobile texting. The question is: how much of it is useful knowledge, and how much creates noise that just makes finding the useful stuff more challenging?
The above chart illustrates concepts and is not intended to indicate relative magnitudes in any dimension. However, it may be interesting to think about which collaborative interactions should be prioritized for knowledge retention (i.e., which contain the most valuable knowledge). Does the interaction medium affect the likelihood of useful information? Or the number of people involved in the interaction? For example, do more real-time media lend themselves to less thought-out posts, perhaps resulting in less useful information early in a chat conversation but more valuable conclusions worth saving at the end? In contrast, an initial blog post involves considerable thought up front (well, some do at least), followed by less formal comments from readers. The bottom line is we don’t know, so all of it is kept for future retroactive analysis. That leaves us in the burgeoning age of big data with the snowballing growth of lots of little data (micro-content) instead of traditional heavy data (big books and documents).