    Vision

    Hackolade Studio, like the legacy tools in its segment, uses Entity-Relationship Diagrams to visualize and maintain data models.  It also includes graph diagrams to represent concepts and relationships, tree views of hierarchical structures, and grid views.  But our next-gen tool takes a modern approach to data modeling: it does not carry the tech debt of legacy tools built decades ago strictly for relational databases.  We embrace 21st-century technology, cloud databases, event-driven architecture patterns, data-in-motion, Git repositories, digital transformation, application modernization, ML/AI, Domain-Driven Design, and much more.

     

    Our mission is to help tackle the complexity at the heart of data, and to reconcile Business and IT through a shared understanding of the context and meaning of data.  This is achieved through the pillars of our vision: Polyglot Data Modeling and Metadata-as-Code, which are best executed through Domain-Driven Data Modeling.
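
    As a loose illustration of the Metadata-as-Code pillar, the Python sketch below treats a data model definition as a plain-text artifact kept in a Git repository, read and checked the way source code would be in a CI pipeline.  The file name, JSON layout, and validation rule are hypothetical and do not represent Hackolade Studio's storage format.

        import json
        import pathlib

        # Hypothetical model artifact versioned in Git next to application code.
        # The path and layout are illustrative only.
        MODEL_FILE = pathlib.Path("models/customer.model.json")

        def load_model(path: pathlib.Path) -> dict:
            """Read the versioned model definition like any other source file."""
            return json.loads(path.read_text(encoding="utf-8"))

        def check_required_keys(model: dict) -> list[str]:
            """A toy CI check: every entity must declare at least one key attribute."""
            problems = []
            for entity in model.get("entities", []):
                if not any(attr.get("key") for attr in entity.get("attributes", [])):
                    problems.append(f"entity '{entity.get('name')}' has no key attribute")
            return problems

        if __name__ == "__main__":
            for problem in check_required_keys(load_model(MODEL_FILE)):
                print("model check failed:", problem)

    Because the model lives in Git as text, changes to it can be branched, diffed, reviewed, and gated just like code changes.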

     

    tl;dr

    If you want your organization to be data-driven, many planets need to align.  In particular, you must master new and complex technologies, and your IT department and the business must work hand-in-hand.  Above all, there must be full alignment on the interpretation of data.

     

    In recent years, Chief Data Officer (CDO) positions have been created in many organizations, and big budgets have been invested in data catalogs, schema registries, and solutions for metadata management, data quality, and data governance.  Are these solutions achieving their objectives?  Do all parties share the same interpretation of the data?  Is there truly a single source of truth for metadata?

     

    With Hackolade Studio, we bring back pragmatic data modeling to facilitate the alignment of business and IT through three fundamental capabilities and one methodology, detailed in the following pages.

     

     

    Why is data modeling more important than ever before?  

    The risk, if we lose sight of what data actually means, is that we end up with hallucinations (in the case of AI) and bad decisions (in the case of analytics).

     

    For years, big data has been accumulated without thinking first about how it would be used.  It has been a code-first and store-first fest that inevitably led to endless efforts to curate data after the fact.  Yet it has been proven time and time again that a design-first approach leads to much higher quality, productivity, and insight.  It is time for business and IT to work hand-in-hand and design together how to store data, agree on its meaning, and make it truly valuable.

     

    In an age of Machine Learning, Artificial Intelligence, data pipelines, data democratization, and self-service analytics, it becomes even more important to perform data modeling because data models standardize information, enable interoperability, show intent, determine trust, and ensure proper data governance through schema contracts.

     

    Information drives businesses that make decisions based on data. Data is a corporate asset. Data modeling is critical to understanding data, its interrelationships, and its rules. Yet some people don’t understand the value that data modeling provides. Some perceive it as just documentation, as a bottleneck to agile development, or even as too expensive to be worth it.

     

    Additionally, there has been buzz around the terms NoSQL, schemaless, and schema-on-read, and around non-relational databases and data lakes, which further promoted the illusion of a silver bullet. But...

     

    The lack of a data model is still a model -- just a crummy one!

     

    Is it realistic to think that one can actually design an application with no structure, no schema, and no relationships?  Isn’t it also ironic that schema design is one of NoSQL’s toughest challenges, triggering countless how-to videos, blogs, and books?

     

    A data model is not just documentation!  One artifact of a data model is the schema contract used by producers and consumers of data to understand each other when communicating.  This can be a forward-engineered DDL for a relational database, an Avro schema used in Kafka pub-sub pipelines, or a REST API specification.
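
    To make the idea concrete, here is a minimal Python sketch that forward-engineers one hypothetical logical model into two schema contracts: a relational CREATE TABLE statement and an Avro record schema.  The entity, attribute names, and type mappings are invented for the illustration and do not represent Hackolade Studio's internal model format or generated output.

        import json

        # A tiny logical model: one entity with typed attributes (illustrative only).
        CUSTOMER = {
            "name": "customer",
            "attributes": [
                {"name": "id", "type": "int", "required": True},
                {"name": "email", "type": "string", "required": True},
                {"name": "signup_date", "type": "string", "required": False},
            ],
        }

        SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)"}
        AVRO_TYPES = {"int": "int", "string": "string"}

        def to_ddl(entity: dict) -> str:
            """Derive a relational CREATE TABLE statement from the entity."""
            columns = []
            for attr in entity["attributes"]:
                null = "NOT NULL" if attr["required"] else "NULL"
                columns.append(f'  {attr["name"]} {SQL_TYPES[attr["type"]]} {null}')
            return f'CREATE TABLE {entity["name"]} (\n' + ",\n".join(columns) + "\n);"

        def to_avro(entity: dict) -> str:
            """Derive an Avro record schema from the same entity."""
            fields = [
                {"name": a["name"],
                 "type": AVRO_TYPES[a["type"]] if a["required"] else ["null", AVRO_TYPES[a["type"]]]}
                for a in entity["attributes"]
            ]
            return json.dumps({"type": "record", "name": entity["name"], "fields": fields}, indent=2)

        if __name__ == "__main__":
            print(to_ddl(CUSTOMER))
            print(to_avro(CUSTOMER))

    Both contracts come from the same source of truth, which is what keeps producers and consumers aligned on the meaning of the data.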

     

    Not only is data modeling not a bottleneck to application development, it has repeatedly been shown to accelerate development, significantly reduce maintenance, increase application quality, and lower execution risks across the enterprise. Experience has shown that just relying on the intuition of software developers is not a repeatable process, nor one ensuring first-time-right success.

     

    "In many ways, up-front data design with NoSQL databases can actually be more important than it is with traditional relational databases [...] Beyond the performance topic, NoSQL databases with flexible schema capabilities require more discipline in aligning to a common information model." Ryan Smith, Information Architect at Nike

     

    You may read more details in the following pages, then continue with the other sections of this online documentation, including Getting Started, Tutorials, How-To Guides, and the rest of the user manual.

     

    Enjoy our next-gen data modeling tool for 21st-century technology.