Overview: In an economy where a company's business network of suppliers, distributors, partners, and customers is an increasingly important source of competitive advantage, semantic interoperability - the ability of human and automated agents to coordinate their functioning based on a shared understanding of the data that flows among them - is a major economic enabler. Consequently, semantic interoperability problems drive up integration costs across industry.
An integration analyst frequently takes on the task of mapping one concrete data structure to another when integrating systems. The data structures to be mapped are often lengthy and complicated, containing many data elements. Current state-of-the-art data mapping tools display both data structures on screen and allow the analyst to graphically draw connections and write expressions specifying what should map to what, according to what rules. These tools take the graphical map as input and produce an executable transformation, in keeping with a model-driven approach that is a genuine advance over writing transformation programs in lower-level code, as was the predominant practice a decade ago. Thus, once the analyst has figured out what the mapping should be and has entered it into the tool, the tool does useful things. However, figuring out what should map to what remains time consuming and error prone, and the integration tools offer no help with that decision. Consequently, bottlenecks delay project implementations, and subtle mistakes creep into the data transformations used to integrate systems, costing the involved parties serious money. Even if the analyst has access to good documentation of the data structures, their size and complexity mean the process is fraught with opportunities for delays and mistakes.
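To make the model-driven idea concrete, the graphical map an analyst draws can be thought of as a small declarative specification that the tool "compiles" into an executable transformation. The sketch below is a toy illustration of that idea, not any vendor's tool; the field names and conversion rules are hypothetical.

```python
# Minimal sketch of a declarative data map compiled into an executable
# transformation. Field names and rules are hypothetical examples.

# The "graphical map" an analyst draws, expressed as (source, target, rule):
MAPPING = [
    ("cust_name", "customerName",  lambda v: v.strip().title()),
    ("cust_id",   "customerId",    int),
    ("amt_cents", "amountDollars", lambda v: v / 100.0),
]

def compile_transform(mapping):
    """Turn the declarative map into a record-to-record function."""
    def transform(source_record):
        return {target: rule(source_record[source])
                for source, target, rule in mapping}
    return transform

transform = compile_transform(MAPPING)
result = transform({"cust_name": "  ada lovelace ",
                    "cust_id": "42",
                    "amt_cents": 1999})
# result == {"customerName": "Ada Lovelace", "customerId": 42,
#            "amountDollars": 19.99}
```

Tools automate this compile step well; the point above is that deciding which source field maps to which target field, and under what rule, is the part that remains manual and error prone.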
In sum, once the analyst has decided what should map to what, current-generation data integration tools simplify the mechanics of describing the mapping and getting it into executable form. That is a big help, but it is not sufficient: if the analyst makes the wrong decision, the tools simply make it possible to execute the wrong decision quickly. This is the essence of the semantic interoperability problem. The problem not only causes delays and errors; it also stops projects from happening altogether. Parties often shy away from initiating a project that could add value in a market segment because high up-front integration costs push the return-on-investment horizon so far into the future that the project is not economically viable.
Why should you attend: The semantic interoperability problem is so large that it can be difficult to know where to begin when seeking to mitigate it. CIOs generally estimate that integration costs eat up 30 to 60 percent of their IT budgets. The good news is that the size of the problem means that we don't have to solve it completely in order to have substantial positive impact. Visa conducted an econometric study that suggested that modest improvements in semantic interoperability have the potential to measurably increase global GDP.
This Virtual Seminar describes techniques for improving semantic interoperability that are starting to make their way into industry. It emphasizes approaches that are relatively simple to implement yet provide real business value, and it places those first steps on a path whose next steps are likewise incremental and valuable. This Virtual Seminar is a more detailed, more technically oriented version of the Webinar entitled "Data Integration and the Semantic Interoperability Barrier: Business Impact and Overview of Practical Solutions."
Areas Covered in the Session:
The current state of the art in data integration - what we have achieved, and what remains to be tackled
The business impact of the lack of semantic interoperability on data integration complexity and costs
Keeping it simple: Identifying the approaches that are most achievable in the short to medium term and yet provide substantial benefit
An in-depth look at new techniques for improving semantic interoperability, leveraging international standards. This includes an examination of the power and limitations of the Semantic Web in attacking the semantic interoperability problem, with a particular focus on the tradeoffs between OWL and SKOS, two key Semantic Web languages used for building ontologies and vocabularies.
A detailed look at an innovative kind of metadata called semantic metadata, which helps to make semantic vocabularies and ontologies actionable by forming a crucial bridge between IT elements and elements of the vocabularies and ontologies. This drill-down includes a close look at rigorous modeling techniques used to incorporate semantic metadata into tools, and explores how semantic metadata leverages the ISO 11179 metadata standard.
A detailed view of how the new techniques are being incorporated into key finance and business reporting standards to which the presenter has been a major contributor
A walk through the available options - and the trade-offs among them - for avoiding unnecessary disruption to the traditional data modeling process when applying the new techniques
Special considerations when applying the techniques to service-oriented architectures
General considerations for applying the techniques to big data
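The semantic-metadata idea covered above can be sketched in a few lines: once each system's concrete data elements are annotated with concepts from a shared vocabulary, a tool can propose candidate field-to-field mappings automatically. The snippet below is a toy illustration with hypothetical field and concept names; it is not the ISO 11179 registry model or any vendor's implementation.

```python
# Minimal sketch of semantic metadata: concrete data elements annotated
# with concepts from a shared vocabulary, used to suggest mappings.
# All field and concept names here are hypothetical.

# Semantic metadata: each system's fields annotated with a vocabulary concept.
SYSTEM_A = {"cc_num": "CreditCardNumber", "cust_nm": "CustomerName"}
SYSTEM_B = {"cardNumber": "CreditCardNumber", "fullName": "CustomerName"}

def suggest_mappings(source, target):
    """Pair up fields whose semantic annotations name the same concept."""
    by_concept = {concept: field for field, concept in target.items()}
    return {field: by_concept[concept]
            for field, concept in source.items()
            if concept in by_concept}

suggestions = suggest_mappings(SYSTEM_A, SYSTEM_B)
# suggestions == {"cc_num": "cardNumber", "cust_nm": "fullName"}
```

The annotations do the work a human analyst otherwise does by reading documentation: they record, once, what each IT element means in vocabulary terms, so that meaning can be compared mechanically across systems.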
Who Will Benefit:
Chief Data Officers
David Frankel has over 30 years of experience as a programmer, architect, and technical strategist. He is recognized as a pioneer and international authority on the subject of model-driven systems and semantic information modeling. He has a wealth of experience driving companies and industry at large to successfully adopt strategic technologies.
He has an outstanding ability to communicate in technical mentoring situations, publications, and presentations. He has published two books and dozens of trade press articles, and has presented at many industry conferences. He has a reputation for facilitating collaboration that, along with his strong technical expertise, has led him to co-author a number of important industry software standards, including UML®, ISO 20022, BIAN, and the XBRL Abstract Model. He served terms as an elected member of the Object Management Group’s Architecture Board and Board of Directors.
The IT domains in which he has expertise include data integration, domain-specific languages, enterprise architecture, model-driven systems, semantic information modeling, semantic interoperability, service-oriented architecture, and software product lines. The business domains in which he has applied his technical expertise in recent years include financial services, ERP financials, and business reporting.
His most recent project involved setting up the service-oriented architecture and tooling framework for the Banking Industry Architecture Network (BIAN) standards organization, during his tenure as SAP’s Lead Standards Architect for Model-Driven Systems.
Overview: This Webinar will review some techniques that have worked in building an effective EA Program and will provide advice on some key activities to focus on and others to avoid.
Today's Business/IT Environment for medium and large corporations can often be characterized as:
An IT environment that is highly complex & costly - organizations are spending more and more money extending, enhancing, and maintaining existing IT systems and services
Ineffective business alignment - organizations are finding it more and more difficult to keep their increasingly expensive IT systems aligned with business needs and to drive the level of innovation required to grow the business
The bottom line is spending effectively and providing agile technology services
Many organizations have attempted to build an Enterprise Architecture Team to address these issues. Most have failed to deliver what the business wants - an effective balance of spending control and the flexibility to innovate and grow the business as needed
The pace of change, and the opportunities it brings, are increasing with no end in sight
First to market is a growing mantra in many businesses today
An effective EA Program should play a leadership role in addressing these needs and wants. Some of the benefits of an effective EA Program include:
Reduced cost to maintain existing systems by reducing the number of technologies in operation (lower the cost of maintenance and training) as well as increasing the utilization of IT assets
Reduced time to bring new capabilities to market by enabling the re-use of Business and IT services
Increased effectiveness of IT resources (capacity planning, solution teams) including increased efficiency of operation (design for change)
Increased effectiveness of Business resources by providing information and support for decision making (triggers points) and enabling impact analysis (reduce technology risk)
Reduced the all-too-often "fire drills" that divert focus from business imperatives
Other Business Benefits possible include:
Improve customer intimacy
Improve Corporate "agility"
Focus on business-relevant investments (high ROI)
Digital innovation support
New products, compressed Supply Chains, partnerships
Other IT organization benefits can include:
Increase speed (and lower cost) to deliver through reuse
Focus attention on high impact business needs
Become a strategic ally not a "controlled expense"
Become an "Innovation Partner"
There is no formula for building an effective Enterprise Architecture Program.
Areas Covered in the Session:
What is EA?
EA provides the bridge between business strategies and IT implementation
Enterprise Architecture is all about managing change
Enterprise Architecture is a Process and a Thing
Many enterprise architecture methodologies have come and gone over the last 20 years. We will discuss some of the more popular ones.
EA programs must address these components:
Enterprise Business Architecture
Enterprise Information Architecture
Enterprise Application Portfolio
Enterprise Technical Architecture
Key characteristics of effective EA Programs
Strong IT and business sponsorship
Clear goals – practical, achievable, and measurable
EA is at the heart of the IT function
Strong EA Team leadership, communication, and facilitation skills
Results are measurable; EA will drive key Business and IT metrics
Lower cost, higher service levels, faster implementation times, etc.
Not all metrics or goals can be achieved simultaneously
Business Leaders have a "feeling of Trust" for the EA Team
EA must earn a "seat at the table" for strategic discussions and due diligence
Who Will Benefit:
Business Leaders that want more from their IT Organization
Greg Kohl is a Sr. Principal Consultant with Trexin Consulting. He has a strong technical background coupled with significant experience working closely with business leaders driving both application and infrastructure areas. Greg has demonstrated success in creating Enterprise Architecture Programs at Honeywell International and Kelly Services and in providing advice to several Trexin clients.
Greg has strong IT management experience from his P&L leadership of both consulting organizations and internal IT strategic leadership positions. He has demonstrated experience bridging business and technology for clients around the globe and leading strategy and IT business planning projects. Greg has driven infrastructure consolidation efforts that resulted in significant cost savings. His IT background spans software development (Operating Systems through Applications), systems integration, Data Center rationalization, DR, Enterprise Architecture as well as the technologies of servers, networking, and the Internet.