A foreword by the author of this website
Hi, I am Matti Seikkula, Head of Spatial Services at Zag – New Zealand's premium SAP partner, premium cloud provider, and independent Geospatial Enterprise Information Integration & Management company.
The purpose of this website is to share some of my doodles on the IT industry, specifically within an Enterprise IS/GIS context. I wanted this site to be informative and fun – in the hope that I can get you to share in the enthusiasm I have for our industry.
and of course:
"The opinions expressed in this website are those of this author, and they do not reflect in any way those of the companies or groups to which I am affiliated."
Esri & SAP - Better Together? (22nd October 2019)
I am currently writing a series of six blog posts (bi-weekly/monthly) for my company Zag that deal with the spatial industry and the Esri and SAP alliance. Short versions are published on our company website, but I will also publish longer versions here on my blog.
Bimodal IT aka 2-Speed IT - BCD Approach (27th July 2019)
Bimodal IT is a term first coined by Gartner to address the problems caused by the "relentless change of technology" on the static, traditional delivery of IT to the business – the need for the business to respond quickly and in an agile manner to its users and customers, while still keeping the lights on for all the legacy and BAU systems within the organisation.
The way Gartner defines Bimodal IT is to have two (to all practical purposes) separate delivery environments within the IT platform, each serving a set of specified systems and tools.
These delivery environments can be from one vendor or multiple vendors, deployed using the same or different product suites. Deployment and management methodologies can differ between the modes, and so can the support mechanisms. The platform can be on-premise, hybrid or cloud, and the two environments within the platform can be deployed on any of those three.
Looking at the other "mover and shaker" technologies and methods currently reshaping the IT world, we can see some strong correlations to Cloud and Managed Infrastructure (outside-in) architectures – both promote separating your environment into an encapsulated environment that can be self-managed and monitored. The first allows internal IT, or even the business itself, to set up their own environments; the second gets a third party to manage your hardware and software in a fully managed and supported environment.
At first, this sounds like a great idea – a "quick wins" method for bypassing the complexities of trying to get the current environment to become the flexible, quick and proactive "beast" needed for rapidly changing business needs. However, there are some caveats that need to be considered:
- Just having two modes like this will actually end up creating silos in your organisation: fully independent systems that have little to no integration with the rest of the mode 1 environment,
- These silos can easily end up managing their own datasets, processes and workflows, and provide a different user experience from the rest of the systems within the organisation,
- IT will not be interested in supporting these systems, as they have no trust in the data within and no understanding of or control over the processes within,
- The mode 1 environment will be left in "limbo" and will not gain any of the enhancements and improvements offered by future hardware and software technology.
This does not sound that good, does it? So what is the right way forward?
First of all, it seems that Bimodal IT is only a transitional stage within the IT delivery model; the end game is for the traditional environment (mode 1) to slowly shrink and transform into the agile environment (mode 2) over time. In a perfect world we would end up with only the agile mode – basically a new, best-of-both-worlds environment that allows for automated, safe, accurate and stable management of data as well as agile, flexible, configurable and quick delivery and tools. In the real world there will always be some systems that cannot make this transition – for example because of limitations of the product suite, system age or user specialisation – and it does not necessarily make sense to try to change these products; they are better off staying where they are.
I also feel that the term Bimodal IT is not entirely correct – it is actually a multi-modal environment that we need here, with three or more modes of operation:
- to cater for the integration between the two modes, making sure data and processes flow between them – data stored in one place only, but available to systems from both modes,
- to cater for multiple different capability needs from the business – there might be no need to create brand new additional environments (modes) for different business needs, but there will be a need to provide different user experiences, different data workflows and so on – a high variation within the mode,
- to cater for innovation within the organisation – this mode is not necessarily there to provide production delivery support; often an innovation framework is needed that allows the organisation to try new things, offer them to customers and retire them if they do not work. For example, consider the fail-fast concept of an innovation "wall": the organisation creates a lot of innovative products and makes them available (throws them at the wall), but only a handful of these products become part of the organisation's product stack (based on usage statistics),
- to cater for other uses of the environment – for example the non-traditional use of APIs rather than systems.
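The fail-fast "wall" idea above can be sketched in a few lines: products are made available, and only those whose usage clears some bar survive into the product stack. Everything here – the product names, usage figures and threshold – is invented for illustration; a real implementation would hang off the organisation's own analytics.

```python
# A minimal sketch of the "innovation wall" retirement rule.
# Product names and usage figures below are made up for illustration.

def triage_products(usage_stats, keep_threshold):
    """Split products into those to keep and those to retire,
    based on a simple usage metric (e.g. monthly active users)."""
    keep = {p: u for p, u in usage_stats.items() if u >= keep_threshold}
    retire = {p: u for p, u in usage_stats.items() if u < keep_threshold}
    return keep, retire

# Hypothetical council innovation products thrown "at the wall":
usage = {"walk-score-map": 4200, "bin-day-lookup": 180, "3d-flyover": 35}
keep, retire = triage_products(usage, keep_threshold=500)
print(sorted(keep))    # products promoted into the product stack
print(sorted(retire))  # products retired from the wall
```

The point is only that the retire decision is driven by measured usage, not by who built the product.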
So instead of Bimodal IT I prefer the term Business Capability Driven (BCD) approach, as this
- is an easier term for the business to understand,
- defines multiple modes of operation based on the capabilities and needs defined by the various business entities.
The core of the BCD approach is to improve the current mode 1 as much as enable the new mode 2: business customers and users get the "quick win" advantages of an agile self-service environment, while IT retains knowledge of this environment and has a road map to transition the current environment to a more automated, flexible future system. The key is mode 3 – the integration framework between modes 1 and 2, in place from the get-go. Enabling this means there has to be strong involvement from IT in the creation and support of mode 2.
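To make the mode 3 idea a little more concrete, here is a minimal, hypothetical sketch of the pass-through half of such an integration framework: mode 2 works against staging edits, while the mode 1 master remains the single official store that both modes read from. All names and structures are assumptions for illustration – a real framework would sit on proper ETL and messaging tooling.

```python
# A toy sketch of the Mode 3 pass-through: data is stored once (in the
# Mode 1 master), while Mode 2 edits arrive as staging changes that an
# automated integration job applies. All names are illustrative.

def sync_staging_to_master(master, staging_edits):
    """Apply Mode 2 staging edits to the Mode 1 master store.
    Each edit is (record_id, new_value); the master remains the
    single official copy served out to both modes."""
    for record_id, new_value in staging_edits:
        master[record_id] = new_value
    return master

master = {"park-001": "v1", "park-002": "v1"}
edits = [("park-002", "v2"), ("park-003", "v1")]  # an update and a new record
sync_staging_to_master(master, edits)
print(master["park-002"], len(master))  # v2 3
```

The design choice being illustrated is the "data stored in one place only" rule: mode 2 never becomes a second master, it feeds the one master through an automated, IT-visible channel.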
So how hard can it be?
Well, actually this can be reasonably hard, so let's explain a potential transition in a bit more detail using a diagram and an example user story. Let's consider the diagram first:
On the left-hand side is the traditional (Mode 1) IT delivery environment; the light-blue bubble depicts the size of the environment, and inside it are four areas of interest: automatic (green bubble) and manual (red bubble) processes for ETL, data maintenance, services and so on; the data maintained (grey bubble – core and reference datasets); and web services (blue bubble) provided to internal, external and public customers.
On the right-hand side is the agile (Mode 2) IT delivery environment, using the same legend for the coloured bubbles. It is expected that in the beginning the agile environment is quite a lot smaller than the traditional environment (size of the light-blue bubble). Other characteristics of Mode 2 are a small amount of data (smaller grey bubble) and most processes being automated rather than manual (smaller red bubble).
The uni/bi-directional arrow between the environments is essentially the integration environment (Mode 3); it manages data transfer and web service access between the two environments. In the first step the expectation is that there is little to no integration between Modes 1 and 2 – the only integration might be allowing Mode 2 users to access web service feeds provided from the Mode 1 environment.
Note that the user interface (UI) is deemed to be outside these environments. The expectation is that the traditional environment serves users via desktop, web browser and mobile devices, as does the agile environment, but agile environment users also have access to web services provided from the traditional environment, e.g. base maps and Web Feature Services. From the user's perspective both environments are transparent, so the user does not actually see two separate environments (the grey-lined transparent polygon around modes 1-3); their access is via the UI, which handles access to the separate environments.
The first stage
- will not change the size of the traditional environment,
- will introduce a lot of automation to the traditional environment (depicted as the red bubble shrinking, the green bubble growing) and
- will enable more web services for users (both Mode 1 and Mode 2 users) to access (like base maps),
- will grow the data (grey bubble) and web services (blue bubble) in the traditional environment, as some of the more mature agile datasets now get stored as part of the master data in the traditional environment,
- will grow the size of the agile environment (depicted as the light-blue bubble growing),
- will introduce additional automation to the agile environment (depicted as the red bubble shrinking and the green bubble growing),
- will grow both data and services (depicted as the blue and grey bubbles growing) within the agile environment – data, as there will be more business data layers available, and services, as these will be enabled via SOA,
- will see integration (Mode 3 arrow) between the two environments:
- all of the master data can be accessed as web services by all users,
- mature agile data will be migrated (as the official version) to the traditional environment, implementing topology and spatial quality workflows, and provided as web services to all users.
The additional stages
- will see the traditional environment start shrinking as automated master data, processes and services begin moving into the agile environment,
- will see the agile environment continue growing as
- new business datasets, processes and services are introduced, and
- data, processes and services from the traditional environment are moved into the agile environment,
- will see integration (Mode 3 arrow) continue between the two environments:
- fully automated master data is migrated from the traditional environment to the agile environment and can now be accessed by all users,
- the rest of the master data can still be accessed as web services by all users,
- only business data that requires manual spatial topology and quality assessment and maintenance will be migrated (as the official version) to the traditional environment and provided as web services to all users.
Eventually the agile environment should be fully automated, and only the complex datasets that require manual processing will stay in the traditional environment. The web services are accessible from the agile environment to all users – essentially, the agile environment becomes the new master data repository. It is plausible that the traditional environment will eventually be migrated to the agile environment entirely, so that there is no need to keep the traditional environment going at all.
User Story example:
Let's consider a Local Government use case: a City Council's Parks business unit as a specific user story we want to push through the Bimodal IT transition. Note that what we really want to achieve with the BCD Approach is for the Council's business units to become self-service, so that they no longer require the GIS team and IT to support and build their services for the public (the value-add data, web and mobile apps) – their customers. At the same time, IT needs to have confidence that the business units will not break the council's high-availability environment and that they will follow the best practices and rules set for the whole council. And the GIS team needs to be confident that their carefully maintained spatial data is not broken, that standards are followed, and that the customers (the public) get a unified user experience with a council-branded look and feel.
This can be achieved by the following overview steps:
- The Parks unit requires its mobile field devices to be able to record and capture boundaries for the parks, and for work that is scheduled to happen in specific parts of parks. Internally these boundaries do not have to be topologically correct – they can overlap each other – so there is no need for complicated spatial business logic.
- In the beginning, the new editable spatial parks layer will be introduced into the Agile Mode 2 environment. Web services will be enabled for it and made accessible within the Parks mobile app. The app uses the base map web services from the Traditional Mode 1 environment.
- Stage 1 will migrate the official version of the parks boundary to the traditional master table and apply spatial business rules to remove overlaps and slivers where viable. The reasoning is that for the parks layer to become an official boundary exposed to the public, it needs to be cleansed. The Parks team will continue maintaining their own copy of the parks boundary in the Agile environment, and automated processes will pass changes through to the Traditional environment.
- The Parks field mobile user can now also toggle on the official parks layer, but can only edit their own parks (staging) layer.
- Eventually, in the additional stages, the parks layer will move back to the Agile environment as part of a group of similar layers that move as master datasets into the Agile environment. The same rules apply to the Parks field mobile user; the only difference is that the official version is now served from the Agile environment rather than from the Traditional environment.
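The automated cleanse-on-the-way-to-master step from the Parks story can be sketched as follows. Real park boundaries are polygons and would need a spatial library to enforce topology rules; here simple 1-D intervals stand in for geometries, and all names are illustrative assumptions rather than any real council implementation.

```python
# A sketch of the "cleanse on the way to master" rule: staging boundaries
# may overlap each other, but the official master version must not.
# 1-D (start, end) intervals stand in for real polygon geometry here.

def cleanse_boundaries(staging):
    """Merge overlapping staging extents into non-overlapping official
    extents, mimicking the remove-overlaps rule applied by the automated
    process before data lands in the master table."""
    merged = []
    for start, end in sorted(staging):
        if merged and start <= merged[-1][1]:  # overlaps the previous extent
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

staging = [(0, 5), (4, 9), (12, 15)]   # first two extents overlap
print(cleanse_boundaries(staging))      # [(0, 9), (12, 15)]
```

The key property mirrors the story: the Parks team keeps editing their overlapping staging layer freely, and only the cleansed, non-overlapping result is promoted to the official version.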
Hope this made sense – it is a simple example of how one boundary could move between the environments, but the idea is basically as follows:
- In the beginning we will have two separate environments running side by side: master data in the Traditional environment, business data in the Agile environment,
- Stage 1 will start enabling some of the business datasets to become part of the master data by moving them from the Agile environment to the Traditional environment,
- Stage 2 and onwards will stop moving data from the Agile environment to the Traditional environment (unless the dataset requires a lot of manual correction). Master data that has automated processes in place will start moving from the Traditional environment to the Agile environment.
The BCD Approach has been implemented with some customers in New Zealand and has been a great success. These implementations have also proved that Mode 1 (traditional/BAU) really is just a transitional mode – that it is possible for organisations to become fully agile and self-service within their spatial environments. And if a highly complicated and visual GIS technology can do it, then all of IT should be able to.
So, there we go. As always, I am interested in your opinion – please let me know what you think.
Growing the Spatial Pie – Pro of the Year (10th December 2015)
I am so proud and grateful to have recently won the New Zealand Spatial Excellence Awards (NZSEA) Professional of the Year award – especially as I do not have a background in Spatial, nor a formal education in it. Maybe this is a sign of the times – that Spatial is coming out of the backroom and into the mainstream!
Spatial is very relevant, easily related to real things, visual and exciting. It provides many benefits and real value for ALL business users. It is also complicated. In countries like New Zealand (NZ), the current market share of the spatial sector as a percentage of Gross Domestic Product (GDP) is typically around (or under) 1% (shown in the diagram below as the red slice). This is an average figure – in a country whose main exports come from primary industry, the number is actually a lot smaller, more in the vicinity of 0.6%, whereas in countries exporting more IS (such as Canada and the UK) it tends to be around 1.2%.
Some countries however have been exporting IS and spatial somewhat longer. For example, in the United States (US) the market share against GDP is possibly as high as 8.7%, which is some 10-15 times higher than in New Zealand.
This highlights the enormous potential that exists for countries like NZ to grow our spatial market share. This can be done in two ways:
1) Supporting growing the IS pie
As there is a little bit of Spatial everywhere, by helping to boost IS exports we can help grow the whole pie and thereby increase the Spatial market share within it. As countries like the US show us, we should be able to grow NZ Inc to as much as 2-3 times its current size (shown in the diagram as the dotted yellow line).
2) Growing the spatial slice
Promoting Spatial everywhere will help us grow the Spatial market share within the pie - our slice of the pie. As countries like the US show us, we should be able to grow this share as much as 10-15 times its current size (shown in the diagram as green slice).
By supporting IS growth and growing the spatial component of this, the NZ spatial sector could potentially grow as much as 20-45 times our current size!
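For what it's worth, the 20-45x figure is just the two growth ranges above multiplied together; a couple of lines make the arithmetic explicit:

```python
# The compounded growth claim: grow the whole IS pie 2-3x AND grow
# spatial's share of that pie 10-15x, and the spatial sector overall
# grows by the product of the two factors.

pie_growth = (2, 3)       # low/high multiplier for the whole IS pie
slice_growth = (10, 15)   # low/high multiplier for spatial's share of the pie

low = pie_growth[0] * slice_growth[0]
high = pie_growth[1] * slice_growth[1]
print(low, high)  # 20 45
```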
So how can we do our part in trying to achieve this tremendous potential? Below are a couple of things I have been doing. If we all work together to promote spatial, we could all achieve this growth for spatial in NZ.
1. Raising spatial awareness at an enterprise level – and this needs to be done with passion!
- Engaging with NZ enterprises to raise the awareness and value of spatial, thereby widening its take-up and usage within NZ organisations.
- In the past, spatial has been considered a silo within the IT framework – a small part of an IT solution. Seeing spatial as one of the keys for businesses wanting to become more client-focused, and enjoying promoting its benefits to CEOs and CIOs.
2. Providing an independent, outside-in/outside-out perspective, helping businesses across all sectors and industries understand the value of spatial, as opposed to staying purely within the Spatial industry (inside-in/inside-out).
- Having the knowledge, leadership and vision to help organisations achieve their objectives using spatial.
- Consulting with a large number of enterprise and government agencies on their business requirements and how spatial can help them achieve their objectives across all industries and sectors.
- Assessing organisations’ current spatial solutions, performing audits to ensure optimal performance of their investment, and advising on the degree to which the business is using spatial to its capacity, and its ability to meet future demands.
- Enjoying determining how to link or enhance an organisation's current enterprise systems with spatial capability, irrespective of the amount of spatial capability it may currently have. This requires knowledge and experience of how IT, business and spatial link together.
3. Keeping up with and promoting technology trends.
- Attending local and international conferences and seminars on a regular basis (Gartner, TechEd, CIO conference, Vendor conferences etc).
- Frequently meeting with customers and peers to demonstrate new innovations, technologies and trends in the spatial arena that may impact their respective industries.
- Spending a lot of time each week reading about spatial, IT and business.
- Spending time with like-minded people around the world, brainstorming ideas.
- Setting up and chairing forums, steering groups and discussions regularly – helping peers and customers understand the spatial value proposition and providing insights into business issues and opportunities that spatial may be able to affect.
- Looking at the big picture, recognising patterns and predicting what might happen in the future. A lot of experience in IT makes this easier.
4. Contributing to the growth and maturity of the industry:
- Connecting people within and across organisations, speaking their language, unifying the market and promoting the benefits of spatial.
- Enjoying networking and engaging at all levels of an organisation, promoting the benefits of spatial.
I'm always interested to hear from you, so please let me know what you think of this article by commenting on the forum page. Or maybe you have some more ideas on how to grow the spatial pie – I would love to hear them! Or maybe my ramblings did not make sense to you – please let me know of anything that was confusing or unclear.
And here is the winner photo from the awards :)
Analytics of Change – the Future of Where (30th August 2015)
I've been researching the different eras of computing, the generations of people, and the IT companies producing and selling IT products, to find trends and reasons for the disruptive major changes seemingly happening right now within the spatial (IT) industry. It looks like there are some findings worth sharing – and some do seem to give us guidance on why things are changing so much more now than previously.
The diagram below is the end result of this exercise, and the first part of a series of articles that I am planning to publish around it. It is a busy diagram, so I'll explain it in more detail below.
The diagram depicts a time slice from the early history of IT, starting in the forties, all the way to the future.
Another line on the diagram is the multi-coloured line depicting the four human generations. There are two representations of these in the diagram – one based on birth dates, and a transformed one that adds 14 years to the birth dates to show when each generation actually starts appearing in the IT workforce. This second line (on top of the black line showing actual years) is the important one for my analysis, as a generation's influence on the IT industry cannot begin until that generation reaches working age.
Note: some researchers believe that generations recycle after four generations – Generation Y, for example, being in many ways similar to the "G.I. Generation".
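The +14-year transformation mentioned above is simple enough to spell out. The birth-year range used below is a commonly quoted approximation for Generation X, included purely as an example rather than the exact dates used in the diagram:

```python
# The birth-to-workforce shift used for the second generation line:
# a generation's influence on IT is counted from when it starts
# entering the workforce, not from its birth years.

WORKFORCE_OFFSET = 14  # years from birth to entering the IT workforce

def workforce_years(birth_start, birth_end, offset=WORKFORCE_OFFSET):
    """Shift a generation's birth-year range to its workforce-entry range."""
    return birth_start + offset, birth_end + offset

# Example: a commonly quoted Generation X birth range, 1965-1980.
print(workforce_years(1965, 1980))  # (1979, 1994)
```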
The next part of the diagram is the company logos, each shown at the time of the company's conception. Position on the Y axis has no relevance other than that the spatial industry is shown closest to the X axis. The horizontal lines connected to each logo show how long it took the company to become successful from its conception. The rule of thumb seems to be that a company usually requires 5-10 years to establish itself as successful, and this time range appears to be getting shorter – nowadays it is more like 3-5 years. One other thing to note is that several new types of businesses have hit successful-company status (over a billion dollars in annual revenue) in the last couple of years – companies like Uber, Waze, Ubisense, Mapcite and FourSquare.
Some companies, like IBM and Apple, have re-invented themselves several times, and it looks like the key to success for longer-lasting companies is to re-invent themselves every couple of decades. I only included some re-invention lines in the diagram to give additional context, as I did not deem them integral to the analysis. One other thing to note is the blue lines close to the X axis; these depict the expected length of the IT product life cycle. In the eighties a software product was expected to last some 10-15 years; by the millennium the range had already halved, and a decade later you only expected to get 1-3 years from your product before major upgrades or replacement were needed. Today the software cycle is less than a year – vendors typically release around three software versions annually, and without constant updating your software will lose compatibility with the latest web browsers and mobile devices in around a year.
RESULTS:
The results of this analysis are depicted by the vertical grey dotted lines at 1968, 1982, 1994, 2004 and 2014. These lines show when major changes seem to happen within the IT (and spatial) industry, and we are in the middle of a major one right now. Firstly, we can see that the time between these major change explosions was getting shorter – 14 years between 1968 and 1982, 12 years between 1982 and 1994 – and has now stabilised at around a decade. Secondly, we can see that several stars have aligned with the change going on right now: IT Digitalization has started and so has the Nexus era, the "Always-On" generation is starting to hit the workforce, and software release cycles have reached an all-time low. Thirdly, we can see some brand new types of businesses emerging and becoming very successful at what they do.
There is also additional evidence that I did not include in this diagram; I will be talking about it in detail in my later posts.
So what we can derive from this analysis is that huge, revolutionary changes are happening right now – and these changes are bigger than anything we have seen previously.
In the bottom of the diagram the yellow line depicts the three eras of Enterprise IT:
- IT Craftsmanship (1993-2001) was an era that emphasized the skills of IT developers, taking an almost engineering-type approach – customization was King, and the developer was omnipotent and knew better than anybody else what worked and what did not.
- IT Industrialization (2001-2014) was an era that emphasized the standardization of IT services through pre-designed and pre-configurable solutions (the 80-20 rule), catering for organisations that were willing to change their workflows to suit these solutions. Automation was King, and the role of developers started changing to administrators and configurers.
- IT Digitalization (2014-…) is an era that pushes IT solutions and services towards changes in business models via digital technologies. This era emphasizes the importance of business finding new revenue and value-producing opportunities. Business is King, and the role of IT and services is becoming OT (Operational Technology).
The next line on the diagram (from the bottom) is the grey line depicting the five eras of IT:
- Mainframe era (1943-1968) was an era that promoted centralized large-capacity computer systems that required highly skilled technicians to administer and program, using languages like FORTRAN and COBOL. User interfaces consisted of dumb terminals and key codes used for manipulating the interfaces. A high level of training was required even for low-level users.
- Workstation era (1968-1981) was an era that brought computing power from mainframes to the workstation. This was the rise of operating systems like UNIX and VMS, with programming languages like C and later Basic and Pascal. The era allowed small to medium-sized organisations to start producing their own IT solutions with the power of the desktop. User interfaces were simplified and simple graphics were introduced.
- PC (Personal Computer) era (1981-1993) was an era that brought workstations to the masses – Macintosh and Windows rose from this era and allowed a wide variety of developers to start developing systems, games and user interfaces. Client-server environments became a trend, and GIS analysts finally got the tools to do high-level graphical mapping analytics. People also started buying PCs for home use; I bought my first PC (an Intel 80286) in 1984, when I was 18, for word processing and games.
- Web era (1993-2011) was an era that finally started producing systems that did not require high computing power on the desktop, but used HTML and JavaScript, as well as plug-ins like Flex, Silverlight and Java that did not require installation on personal computers. In a way, the dumb terminals were back, but the graphical interfaces were so sophisticated that you could not really tell the difference between a desktop system and a web browser system.
- Nexus era (2011-…) is an era that embraces the convergence and mutual reinforcement of social, mobility, cloud and information patterns. The user is King, and the goal is to drive new business scenarios via the use of common patterns across the four “forces”.
The next line on the diagram (from the bottom) is the multi-coloured line depicting the four human generations. There are two representations of these in the diagram – one based on birth dates, and a transformed one that adds 14 years to the birth dates to show when each generation actually starts appearing in the IT workforce. For my analytics, this second line (on top of the black line with the actual years) is the important one, as a generation cannot influence the IT industry until it reaches workforce age:
- Baby Boomers (1943-1961, workforce influence from 1957 & 1975 onwards) is a generation that introduced optimistic views on where the world was going, sometimes even excessive consumerism, and a belief that they were the first generation truly different (special) from those that came before. Baby Boomers had a great influence and involvement in the Mainframe and Workstation eras, less so in the PC era. The Web era, and especially the Nexus era, are quite foreign to Baby Boomers and require significant effort from them to conceptualize.
- Generation X – “Gen X” (1961-1981, workforce influence from 1975 & 1995 onwards) is a generation that is highly educated and embraces change in the world and in human interactions. Gen X’ers were often born into families with the wealth to provide additional luxuries like computers, and as such this generation influenced the PC and Web eras and was the rising force behind the IT Craftsmanship era. Gen X tends to understand IT Industrialization well, but the changing world of the Nexus era and IT Digitalization is somewhat foreign to them.
- Generation Y - “Millennials” (1981-2001, workforce influence from 1995 & 2015 onwards) is the generation that has also been described as “Generation Me” – with traits of confidence and tolerance as well as a sense of entitlement and narcissism. This is the first generation that can be described as digital natives – they are used to computational devices of all kinds, though there are also groups within the Millennials who do not care about digital technologies in the same way and are more attuned to human factors. Most current IT employees are Millennials, and they have a wide influence on IT Digitalization and on defining the Nexus era.
- Generation Z - “Always On” (2001-2020, workforce influence from 2015 onwards) is the first generation that is fully digital from birth to grave; these digital natives can juggle many tasks at the same time and use the internet almost as an extension of their brain, and as such approach problems very differently to their elders (for example via crowd-sourcing). But consequently they also suffer from a need for instant gratification, a lack of patience and an inability for deep thinking; this could endanger our whole society with a generation of shallow consumers who add no value to the growing internet of things and who do not easily recognize the difference between real information and noise. It is likely that this generation will boost ubiquity, social media and consumerization into brand new industries and ways of using information.
Note: some researchers believe that generations recycle every 4 generations, Generation Y for example being in many ways similar to the “G.I. Generation”.
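The birth-year-to-workforce transformation used in the list above is simple enough to sketch in a few lines. The generation names, birth-start years and the 14-year offset come straight from the post; the code itself is purely illustrative:

```python
# Illustrative sketch of the +14-year workforce offset described above.
# Generation names and birth-start years are taken from the post itself.
BIRTH_START = {
    "Baby Boomers": 1943,
    "Generation X": 1961,
    "Generation Y": 1981,
    "Generation Z": 2001,
}

WORKFORCE_OFFSET = 14  # years from birth until a generation starts entering the workforce


def workforce_influence_start(birth_start_year: int) -> int:
    """Shift a generation's birth-start year by the workforce offset."""
    return birth_start_year + WORKFORCE_OFFSET


for name, start in BIRTH_START.items():
    print(f"{name}: workforce influence from {workforce_influence_start(start)} onwards")
# Baby Boomers -> 1957, Gen X -> 1975, Gen Y -> 1995, Gen Z -> 2015,
# matching the "workforce influence from" years listed above.
```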
The next part of the diagram is the company logos, shown at the time of each company’s conception. Position on the Y axis has no relevance other than that the spatial industry is shown closest to the X axis. The horizontal lines connected to each logo show how long it took the company to become successful from its conception. The rule of thumb seems to be that a company usually requires 5-10 years to establish itself as successful. It also looks like this time range is becoming shorter and is nowadays more like 3-5 years. One other thing to note is that several new types of businesses have hit successful-company status (over a billion dollars in annual revenue) in the last couple of years - companies like Uber, Waze, Ubisense, Mapcite and FourSquare.
Some companies like IBM and Apple have re-invented themselves several times, and it looks like the key to success for longer-lasting companies is to re-invent themselves every couple of decades. I only included some re-invention lines in the diagram for additional context, as I did not deem them an integral part of the analytics. One other thing to note are the blue lines close to the X axis; these depict the expected length of the IT product life cycle. In the eighties a software product was expected to last some 10-15 years; by the millennium the range had already halved, and a decade later you only expected to get 1-3 years from your product before a major upgrade or replacement was needed. Today the software cycle is less than a year – vendors typically release around three software versions annually, and without constant updates your software will lose compatibility with the latest web browsers and mobile devices in around a year.
RESULTS:
The results of this analysis are depicted by the vertical grey dotted lines at 1968, 1982, 1994, 2004 and 2014. These lines show when major changes seem to happen within the IT (and Spatial) industry, and we are in the middle of a major one right now. Firstly, we can see that the time between these major change explosions was getting shorter – 14 years between 1968 and 1982, 12 years between 1982 and 1994 – but has now stabilized at around a decade. Secondly, we can see that several stars have aligned with the change going on right now: IT Digitalization has started and so has the Nexus era, Generation “Always-On” is starting to hit the workforce, and software release cycles have reached an all-time low. Thirdly, we can see brand new types of businesses emerging and becoming very successful at what they do.
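The interval arithmetic behind that first observation can be checked with a quick sketch (Python, purely illustrative):

```python
# Gaps between the major change years marked by the grey dotted lines in the diagram.
change_years = [1968, 1982, 1994, 2004, 2014]

gaps = [b - a for a, b in zip(change_years, change_years[1:])]
print(gaps)  # [14, 12, 10, 10] -- shortening, then stabilizing at about a decade
```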
There is also additional evidence that I did not include in this diagram (I will talk about these in detail in later posts):
- Relentless Change – the vendors are moving to the cloud, enabling even faster upgrades and versioning (daily?), but forcing customers to change their processes.
- Role of Business – the business is changing and taking over; IT is losing its budgets and its capability and resources to innovate, becoming BAU support only.
- Disruptive Technologies – new business types are emerging, existing businesses are dying off, costs are being pushed down, and traditional employee roles are changing.
- Consumerization – business is driven from the customer end, consumers are becoming a new breed of vendors, and the 3rd Platform + Internet of Things offer potentially huge savings for organisations via the use of APIs, enabling (external) Stealth/Shadow IT.
- Smart Machines – are now becoming businesses, and the role of M2M will grow drastically.
- Human Factor – user interfaces will need to change to cater for Millennials and Gen-AO, as these are a different breed (compared to us older folk).
So what we can derive from this analysis is that huge revolutionary changes are happening right now, and these changes are bigger than anything we have seen before.
Some Thoughts on Enterprise Mobility (30th August 2014)
Mobility has become an important strategy for organisations. However, it is still relatively immature compared to the rest of IS, especially around standards and frameworks. So in this fairly large blog post I wanted to clarify some common mobility strategies, as well as include some short synopses of findings/comments from a geospatial perspective.
OVERVIEW
First of all, some context and acronyms to start with:
- MDM – Mobile Device Management
- MAM – Mobile Application Management
- MEM – Mobile Enterprise Management (MDM & MAM combined; increasingly called Enterprise Mobility Management, EMM)
- MEAP – Mobile Enterprise Application Platform
The relationships between these acronyms can best be described with the diagram below. You can also read the full blog post (where that diagram was included) here.
RELEVANCE
There is no doubt that MEM (MDM & MAM) is essential to all organisations as a future strategy for mobility, and that mobility itself is a strategic influencer area separate from standard IS – which it is clearly a part of – as it differs from IS in several fundamental ways: for example maturity level, toolsets & standards, and UI and user experience; and typically the applications built for mobility (and the user stories/groups mobility addresses) are quite different from browser/desktop apps.
However, I seriously doubt the value of MEAP - or that I would even want to promote it as a future strategy for geospatially aware organisations (some 80% of you). One of the main reasons for this is that MEAP promotes the use of simple user-configurable application templates for creating simple applications. The problem is that these templates are very specific and restrictive – so much so that it is arguable whether any kind of geospatial/mapping interface can be provided using them. And even if a mapping UI could be provided, it would not include the richness of features we expect today.
As far as MEM goes - there are not many MEM solutions out there yet, but as I mentioned before, all MDM solutions are becoming MEM, so for the rest of this post I will mix & match the MDM and MEM acronyms as needed.
NEAR-FUTURE/NOW
Crystal-ball gazing into the near future has been done by many visionaries before. Here is what a local New Zealand visionary predicted we could expect by the year 2020 (which is only 6 years away!), along with some comments I wanted to add into the mix:
- You can buy a capable smart device for $20 – agreed; prices are going down fast and will continue to do so, totally replacing traditional mobiles. Already in the US some $50 gets you a “good enough” device (for most cases), and in India you can now buy such a device (specs pretty much the same as a Samsung S4) - see here.
- You can purchase unlimited mobile services for $20 per month – agreed; even in New Zealand this is bound to happen. In the US you can already get this from some providers for $19 per month.
- You can build an enterprise mobile app in 20 minutes – disagreed, but mainly because the term “build” to me means more than just configuring/developing/deploying a one-form app to replace a paper-based process; you also need to consider implications like business analysis, user requirements gathering, user stories, and design across UIs and user groups, often to replicate potentially complicated legacy workflow/business-rule-driven applications. And because you are replacing a large enterprise system with something very different, additional project entities like documentation, training, full system testing, support, DR, and data maintenance/ETL processes will be needed. Note that simple “apps” have been buildable in 20 minutes for quite some time - see here.
- BYOD (Bring Your Own Device) will become mainstream – agreed; this is already happening, and at alarming speed. Gartner predicts that by 2017 over half of the world's employers will require BYOD from their employees. And some statistics claim that BYOD by over 50% of employees is already a fact - see here.
- There will be a mass adoption of enterprise mobility – agreed; this will have to happen, as a silo approach – with user/consumer tools available for anybody to configure apps rather than build them – will just result in an uncontrolled mess. However, this adoption will have to happen as part of (and supported by) standard enterprise IS strategies, not as silo platforms or processes. The enterprise mobility technologies available today still feel (to me) quite immature, disconnected (siloed) and lacking process. There seems to be a lack of universal standards/practices – every vendor has their own processes/principles, which might or might not comply with your organisation's visions and IS strategies. What is even more alarming is that some statistics claim that by the end of 2016 up to 90% of organisations will have developed some mobility apps – and without frameworks this is really asking for trouble.
GENERATION AO
Another interesting discussion subject around mobility (and especially around MEAP) that recently emerged is the importance of Generation Always-On (AO) - see here – and how/why companies should, within the near future (5 years), try to attract the top/best/cleverest 20% of these as employees, and how difficult that would be to do without a good framework. I do have a couple of problems with these kinds of statements:
- The first problem is that most organisations are not planning to attract the top 20% of these future employees. Only some very select/specific organisations require ALL of their employees to be visionaries and inventors (top minds) - most organisations cannot provide tools and tasks that would keep these people's minds entertained on a daily basis.
- The second problem is that an enterprise mobility platform (MEAP) set up using the technologies (and templates) available today will only be cool and attractive today, not necessarily in the future; the problem I see is that the application templates used within frameworks like MEAP are targeted at building very simple and specific interfaces, which in a couple of years' time will look very dated. And I would even argue that these template-built interfaces won't be attractive to those top-20% minds, nor probably to most Generation AO representatives, today or in the near future.
WEB vs NATIVE vs HYBRID
On to another interesting discussion item then - whether you should build web apps, native apps or hybrid apps for mobility. Most enterprise architects I've talked with seem to believe hybrid apps are the way to go, which is why they are enabled within strategies like MEAP.
The diagram on the right captures the key reasons quite well. You can also read the original blog post/article here.
GEOSPATIAL HYBRID
Although I agree that in a perfect world hybrid apps would be a great way to go, I have to disagree with the need for geospatial hybrid apps (using application templates) - the problems as I see them are:
- Geospatial web app interfaces, using APIs and SDKs, are constantly improving – the gap to native app interfaces is closing rapidly. The APIs are also getting more sophisticated, making it easier and faster to build solutions with them.
- Geospatial native app SDKs are becoming more similar; for example, Esri provides SDKs for Android, iOS and Windows Phone that are all services-based, with the same interfaces and fundamentally similar structure and functionality. This means you would still need to build 3 separate apps, but they would be almost exactly the same, so you only need to solve the problems/interfaces once and then clone the other 2. This makes it a lot faster and easier to build native apps, and so the gap to multi-platform solutions is closing rapidly.
- As both native and web app interfaces are rapidly closing the productivity gap, why would you invest in a hybrid solution knowing that in a couple of years' time hybrid apps might be discontinued and an irrelevant technology? Combine this with the application template limitations already discussed earlier in this post – and to me, investing in learning the hybrid technology is just not worth it.
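The “solve the problem once, then clone per platform” idea in the second bullet can be sketched as a thin platform abstraction over shared, services-based logic. The class and method names below are hypothetical (they are not Esri's actual SDK API); the sketch just illustrates the pattern:

```python
# Hypothetical sketch of "solve once, clone per platform": the services-based
# logic lives in the shared base class, and each platform clone only supplies
# what is genuinely platform-specific. Names here are illustrative, not Esri's API.
from abc import ABC, abstractmethod


class MapClient(ABC):
    """Shared map logic, written once for all platforms."""

    def load_layer(self, service_url: str) -> str:
        # The service call is identical on every platform; only rendering differs.
        return f"{self.platform()}: rendering layer from {service_url}"

    @abstractmethod
    def platform(self) -> str:
        """The only per-platform piece in this sketch."""


class AndroidMapClient(MapClient):
    def platform(self) -> str:
        return "Android"


class IOSMapClient(MapClient):
    def platform(self) -> str:
        return "iOS"


class WindowsPhoneMapClient(MapClient):
    def platform(self) -> str:
        return "Windows Phone"


for client in (AndroidMapClient(), IOSMapClient(), WindowsPhoneMapClient()):
    print(client.load_layer("https://example.org/arcgis/rest/services/roads"))
```

The point of the pattern: three apps ship, but the mapping problem is solved exactly once.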
ONESIES & OMOMs
Another interesting concept I heard one of the mobile framework vendors introduce was the term “onesy” apps – simple mobility apps that follow these simple rules:
- One thumb, the user interface is easy enough to use with one thumb.
- One screen, there is only one form/screen to interact with, no need to go through several screens/forms of information.
- One minute, the app is only used for a short period of time – either to quickly record information or to view a query result (flow-through), with links through to some other app/additional information.
- One app, always used for this type of interaction or for accessing this type of information, with no duplication.
- One mission, a single-topic, simplified, specialised interface with a small footprint – great performance, no training required.
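The five rules above lend themselves to a trivial checklist. A minimal sketch, assuming my own shorthand field names for each rule (the vendor's term, not their implementation):

```python
# Illustrative checklist for the "onesy" rules above; field names are my own shorthand.
ONESY_RULES = ("one_thumb", "one_screen", "one_minute", "one_app", "one_mission")


def is_onesy(app_spec: dict) -> bool:
    """An app qualifies as a 'onesy' only if it satisfies every rule."""
    return all(app_spec.get(rule, False) for rule in ONESY_RULES)


field_capture = {rule: True for rule in ONESY_RULES}
multi_screen_report = {**field_capture, "one_screen": False}

print(is_onesy(field_capture))        # True
print(is_onesy(multi_screen_report))  # False -- more than one screen disqualifies it
```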
QUICK-WINS
How then to gain some quick wins in enabling the future state of mobile technology? One approach is the SAMR (Substitution Augmentation Modification Redefinition) model, which provides some simple high-level steps for organisations to achieve their mobility future state:
- Substitution; replace current paper-based, legacy or manual tools and processes directly (like-for-like), with no functional change.
- Augmentation; use technology advances as a direct tool/process substitute, but cater for functional improvements – usually incorporated as additional steps in processes/tools.
- Modification; redesign tasks and workflows – usually a significant rewrite requiring a lot of design and analysis.
- Redefinition; allows the organisation to create and perform new tasks that were previously inconceivable.
- Transformation; transform the way the organisation works – change personnel's roles & job functions, create some new job functions, retire some old ones – TRUE transformation for the organisation from legacy to mobile.
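The steps above form an ordered maturity progression, which can be sketched as an enumeration. The fifth step (Transformation) follows this post's extension of the standard four-letter SAMR model; the type and function names are mine:

```python
# The SAMR steps above as an ordered progression; the fifth step (Transformation)
# follows this post's extension of the model. Names are illustrative.
from enum import IntEnum
from typing import Optional


class MobilityMaturity(IntEnum):
    SUBSTITUTION = 1    # like-for-like replacement, no functional change
    AUGMENTATION = 2    # direct substitute plus functional improvements
    MODIFICATION = 3    # tasks and workflows redesigned
    REDEFINITION = 4    # previously inconceivable tasks become possible
    TRANSFORMATION = 5  # roles and job functions change across the organisation


def next_step(current: MobilityMaturity) -> Optional[MobilityMaturity]:
    """Return the next maturity step, or None once fully transformed."""
    if current < MobilityMaturity.TRANSFORMATION:
        return MobilityMaturity(current + 1)
    return None


print(next_step(MobilityMaturity.SUBSTITUTION).name)  # AUGMENTATION
```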
WINNERS & LOSERS
As mentioned, I do believe MEM is fundamental to organisations going forward. So who are the current MEM providers, and which ones are the leaders? Gartner has diagrams to showcase this:
The first thing to note is that in mid-2013 (a year ago - diagram on the left) the term Mobile Device Management was used; a year later (June 2014 - diagram on the right) the term has changed to Enterprise Mobility Management. This is how fast this area moves and how important (enterprise-wide) mobility is becoming. The comparison article can be found here.
The leaders are in the top-right corner, the ones to watch in the bottom-right corner, challengers in the top-left and the why-should-you-care players in the bottom-left. The only notable changes are SAP losing ground and IBM gaining ground. Other than that, this year's picture is very similar to last year's, bar the fact that twice as many dollars are spent on mobility management today as last year.
Note that since Gartner provided this (May 2013 – some 14 months ago) there are now 132 vendors out there providing some kind of MDM/MEM solution. Note also that from a New Zealand perspective there are some potentially significant players missing from that picture, like Microsoft's Windows Intune - there are no statistics on how well it is doing, but as MS is so strong in NZ, you have to wonder.
IMPLEMENTATION
For an organisation deciding on which MEM to use, it is much more important to investigate the 3 fundamental ways an MDM can be implemented:
- Profiles based – an app is uploaded onto the mobile device and takes care of security, accessibility and use of any business apps. The app can be upgraded automatically or via business process controls through the server software. It runs in the device's memory and interacts whenever a business app is launched. The biggest downside is that users can deactivate, uninstall or delete this app and still access the business apps; the only thing the corporate server software can do in that case is spam the device with emails to get the app back up and running. This could potentially provide offline capability, but with the risk of having to trust the users (as offline use cannot authenticate). Software vendors using this type of deployment strategy include AirWatch and MobileIron.
- Containerisation – a secure container (folder) is dedicated within the device that includes all the business apps. The apps can only be used when the user has logged in to the company network (authentication). This is the most secure strategy; the downside, however, is that apps need to be built using the defined sandbox templates and business rules. This can restrict solutions to specific (certified) products and add complexity to building bespoke solutions. It is also doubtful that offline capability could be provided with this type of strategy. Software vendors using this type of deployment strategy include Good Technology and BlackBerry.
- Virtualisation – this strategy requires no authentication software on the device; the apps run online and the server end decides what apps are available to which users. The solution is very secure and enables the widest variety of apps (products) with no additional business rules for bespoke solutions. This type does not provide for offline use and could also be quite network intensive. One downside is that the server software does not recognise jail-broken/rooted devices (approximately 5% of iOS/Android devices). Software vendors using this type of deployment strategy include Citrix, VMWare and Windows Intune.
GEOSPATIAL CONSIDERATIONS
So how do these fundamental strategies stack up for geospatial build and deployment of enterprise level mobility?
- Security considerations – Containerisation is the safest bet, with Virtualisation a close second.
- Geospatial native app build capability – the Profiles based and Virtualisation strategies are great for developing geospatial native apps, as there is no need to code anything special into the apps. Containerisation restricts the build of native apps.
- Geospatial web app build capability – Virtualisation is great for building geospatial web apps, as authentication happens within the server environment. Profiles based and Containerisation are more suitable for building native apps.
- Offline capability – Virtualisation is no good for offline apps. Profiles based has limitations for offline, as authentication requires online connectivity. Containerisation will support offline use easily.
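The assessment above can be encoded as a simple capability matrix and used to shortlist a strategy against requirements. The scores are my own reading of the verdicts in the bullets (2 = strong, 1 = limited, 0 = poor) – a sketch, not a definitive scoring:

```python
# The geospatial assessment above as a capability matrix. Scores are my own
# reading of the post's verdicts: 2 = strong, 1 = limited, 0 = poor.
CAPABILITIES = {
    "Profiles":         {"security": 1, "native_build": 2, "web_build": 1, "offline": 1},
    "Containerisation": {"security": 2, "native_build": 0, "web_build": 1, "offline": 2},
    "Virtualisation":   {"security": 2, "native_build": 2, "web_build": 2, "offline": 0},
}


def best_strategy(requirements: list) -> str:
    """Pick the strategy with the highest combined score for the given requirements."""
    return max(CAPABILITIES, key=lambda s: sum(CAPABILITIES[s][r] for r in requirements))


print(best_strategy(["security", "web_build"]))  # Virtualisation
print(best_strategy(["security", "offline"]))    # Containerisation
```

Notice how the chooser reproduces the trade-off in the bullets: require offline and Virtualisation drops out; require unrestricted native builds and Containerisation does.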
You can see 6 archived stories here.