
MetaDAMA - Data Management in the Nordics

Winfried Adalbert Etzel - DAMA Norway

Available Episodes

5 of 72
  • 4#10 - Geir Myrind - The Revival of Data Modeling (Nor)
    "Vi modellerer for å forstå, organisere og strukturere dataene." / "We model to understand, organize, and structure the data."This episode with Geir Myrind, Chief Information Architect, offers a deep dive into the value of data modeling in organizations. We explore how unified models can enhance the value of data analysis across platforms and discuss the technological development trends that have shaped this field. Historical shifts toward more customized systems have also challenged the way we approach data modeling in public agencies such as the Norwegian Tax Administration.Here are my key takeaways:StandardizationStandardization is a starting point to build a foundation, but not something that let you advance beyond best practice.Use standards to agree on ground rules, that can frame our work, make it interoperable.Conceptual modeling is about understanding a domain, its semantics and key concepts, using standards to ensure consistency and support interoperability.Data ModelingModeling is an important method to bridge business and data.More and more these conceptual models gain relevance for people outside data and IT to understand how things relate.Models make it possible to be understood by both humans and machines.If you are too application focused, data will not reach its potential and you will not be able to utilize data models to their full benefits.This application focus which has been prominent in mainstream IT for many years now is probably the reason why data modeling has lost some of its popularity.Tool advancement and new technology can have an impact on Data Management practices.New tools need a certain data readiness, a foundation to create value, e.g. 
a good metadata foundation.Data Modeling has often been viewed as a bureaucratic process with little flexibility.Agility in Data Modeling is about modeling being an integrated part of the work - be present, involved, addressed.The information architect and data modeling cannot be a secretary to the development process but needs to be involved as an active part in the cross-functional teams.Information needs to be connected across domains and therefore information modeling should be connected to business architecture and process modeling.Modeling tools are too often connected only to the discipline you are modeling within (e.g. different tools for Data vs. Process Modeling).There is substantial value in understanding what information and data is used in which processes and in what way.The greatest potential is within reusability of data, its semantics and the knowledge it represents.The role of Information ArchitectInformation Architects have played a central role for decades.While the role itself is stable it has to face different challenges today.Information is fluctuant and its movement needs to be understood, be it through applications or processes.Whilst modeling is a vital part of the work, Information Architects need to keep a focus on the big picture and the overhauling architecture.Information architects are needed both in projects and within domains.There is a difference between Information and Data Architects. Data Architects focus on the data layer, within the information architecture, much closer to decisions made in IT.The biggest change in skills and competency needs for Information Architects is that they have to navigate a much more complex and interdisciplinary landscape.MetadataData Catalogs typically include components on Metadata Management.We need to define Metadata broader - it includes much more than data about data, but rather data about things.
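    The episode's point that a model should be understandable by both humans and machines can be illustrated with a small Python sketch (the Taxpayer/TaxReturn concepts are invented for illustration, not taken from the episode): the same structure lets code validate relationships while rendering a plain-language glossary for people.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Concept:
    """A business concept in the conceptual model."""
    name: str
    definition: str

@dataclass(frozen=True)
class Relation:
    """A named relationship between two concepts."""
    source: str
    verb: str
    target: str

@dataclass
class ConceptualModel:
    concepts: dict = field(default_factory=dict)
    relations: list = field(default_factory=list)

    def add_concept(self, c: Concept) -> None:
        self.concepts[c.name] = c

    def relate(self, source: str, verb: str, target: str) -> None:
        # Machine-checkable rule: both ends must be defined concepts.
        for name in (source, target):
            if name not in self.concepts:
                raise ValueError(f"Unknown concept: {name}")
        self.relations.append(Relation(source, verb, target))

    def glossary(self) -> str:
        # Human-readable rendering of the very same model.
        lines = [f"{c.name}: {c.definition}" for c in self.concepts.values()]
        lines += [f"{r.source} {r.verb} {r.target}" for r in self.relations]
        return "\n".join(lines)

model = ConceptualModel()
model.add_concept(Concept("Taxpayer", "A person or entity liable for tax"))
model.add_concept(Concept("TaxReturn", "A yearly statement of income and deductions"))
model.relate("Taxpayer", "submits", "TaxReturn")
print(model.glossary())
```

    The glossary reads as documentation for the business, while the relationship check gives machines something to enforce.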
    --------  
    41:25
  • 4#9 - Marte Kjelvik & Jørgen Brenne - Healthcare Data Management: Towards Standardization and Integration (Nor)
    "Den største utfordringen, det viktigste å ta tak i, det er å standardisere på nasjonalt nivå. / The biggest challenge, the most important thing to address, is standardizing at the national level."The healthcare industry is undergoing a significant transformation, driven by the need to modernize health registries and create a cohesive approach to data governance. At the heart of this transformation is the ambition to harness the power of data to improve decision-making, streamline processes, and enhance patient outcomes. Jørgen Brenne, as a technical project manager, and Marte Kjelvik’s team, have been instrumental in navigating the complexities of this change. Their insights shed light on the challenges and opportunities inherent in healthcare data modernization.Here are my key takeaways:Healthcare data and registryIts important to navigate different requirements from different sources of authority.To maintain comprehensive, secure, and well-managed data registries is a challenging task.We need a national standardized language to create a common understanding of health data, what services we offer within healthcare and how they align.Authorities need also to standardize requirements for code and systems.National healthcare data registry needs to be more connected to the healthcare services, to understand data availability and data needs.CompetencyData Governance and Data Management are the foundational needs the registry has recognized.Dimensional Modeling was one of the first classes, they trained their data team on, to ensure this foundational competency.If the technology you choose supports your methodology, your recruitment of new resources becomes easier, since you don’t need to get experts on that very methodology.ModelsUser stories are a focus point and prioritized. Data Lineage (How data changed through different systems) is not the same as Data Provenience (Where is the datas origin). 
You need both to understand business logic and intent of collection) - User stories can help establish that link.Understanding basic concepts and entities accounts for 80% of the work.Conceptual models ensured to not reflect technical elements.These models should be shareable to be a way to explain your services externally.Could first provides an open basis to work from that can be seen as an opportunity.There are many possibilities to ensure security, availability, and discoverability.Digitalization in Norwegian public services has brought forth a set of common components, that agencies are encouraged to use across public administration.Work based on experiences and exchange with others, while ensuring good documentation of processes.Find standardized ways of building logical models, based on Data Contracts.By using global business keys, you can ensure that you gain structured insight into the data that is transmitted.Low Code tools generate generic code, based on the model to ensure effective distribution and storage of that data in the registry.The logical model needs to capture the data needs of the users.Data Vault 2.0 as a modeling tool to process new dats sources and adhering to a logical structure.There is a discipline reference group established to ensure business alignment and verification of the models.Data should be catalogued as soon as it enters the system to capture the accompanying logic.Data VaultAdaptable to change and able to coordinated different sources and methods.It supports change of formats without the need to change code.It makes parallel data processing possible at scale.Yet due to the heterogeneity of data vault, you need some tool to mange.
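    The "global business keys" idea pairs naturally with Data Vault's hub tables. A minimal sketch (the patient-ID values and source names are hypothetical, and hashing with MD5 is one common Data Vault convention, not necessarily what the registry uses): normalize the business key, hash it, and every source system maps the same real-world entity to the same hub key.

```python
import hashlib

def hub_key(business_key: str) -> str:
    """Hash a normalized global business key into a deterministic hub key."""
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def hub_row(business_key: str, record_source: str) -> dict:
    """Build a hub entry; the key is stable no matter which source sends it."""
    return {
        "hub_key": hub_key(business_key),
        "business_key": business_key.strip().upper(),
        "record_source": record_source,
    }

# Two sources spelling the same identifier differently land on one hub key,
# which is what gives structured insight into the data being transmitted.
a = hub_row("nor-12345", "registry_a")
b = hub_row("  NOR-12345 ", "registry_b")
print(a["hub_key"] == b["hub_key"])
```

    Because the key is derived rather than assigned, new sources can be added in parallel without coordinating surrogate-key generation, which is part of why Data Vault scales the way the episode describes.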
    --------  
    30:44
  • Holiday Special: Joe Reis - A Journey around the World of Data (Eng)
    «Data Management is an interesting one: If it fails, what's the feedback loop?»

    For the Holiday Special of Season 4, we've invited the author of «Fundamentals of Data Engineering», podcast host of the «Joe Reis Show», «Mixed Model Arts» sensei, and «recovering Data Scientist» Joe Reis. Joe has been a transformative voice in the field of data engineering and beyond. He is also the author of the upcoming book with the working title "Mixed Model Arts", which redefines data modeling for the modern era.

    This episode covers the evolution of data science, its early promise, and its current challenges. Joe reflects on how the role of the data scientist has been misunderstood and diluted, emphasizing the importance of data engineering as a foundational discipline. We explore why data modeling, a once-vital skill, has fallen by the wayside and why it must be revived to support today's complex data ecosystems. Joe offers insights into the nuances of real-time systems, the significance of data contracts, and the role of governance in creating accountability and fostering collaboration.

    We also highlight two major book releases: Joe's "Mixed Model Arts", a guide to modernizing data modeling practices, and our host Winfried Etzel's book on federated Data Governance, which outlines practical approaches to governing data in fast-evolving decentralized organizations. Together, these works promise to provide actionable solutions to some of the most pressing challenges in data management today. Join us for a forward-thinking conversation that challenges conventional wisdom and equips you with insights to start rethinking how data is managed, modeled, and governed in your organization.

    Some key takeaways:

    Make Data Management tangible
    • Data Management is not clear enough to be understood, to have feedback loops, to ensure responsibility, and to understand what good looks like.
    • Because Data Management is not always clear enough, there is pressure to make it more tangible.
    • That pressure also applies to Data Governance, through new roles like Data Governance Engineers, DataGovOps, etc.
    • These roles mash together enforcing policies with designing policies.

    Data Contracts
    • Shift Left in data needs to be understood more clearly, moving towards closer understanding of and collaboration with source systems.
    • Data contracts are necessary, but no different from interface files in software: it's about understanding behavior and expectations.
    • Data contracts are not only about controlling, but also about making issues visible.

    Data Governance
    • Think of Data Governance as political parties: some might be liberal, some more conservative.
    • We need to make Data Governance lean, integrated, and collaborative, while at the same time ensuring oversight and accountability.
    • People need a reason to care about governance rules and to be held accountable.
    • Otherwise, Data Governance «(...) ends up being that committee of waste.»
    • The current way Data Governance is done doesn't work. It needs a new look.
    • Enforcing rules that people see no connection to, or ownership within, is doomed to fail.
    • We need to view ownership from two perspectives, a legal and a business perspective. They are different.

    Data Modeling
    • Business processes, domains, and standards are some of the building blocks for data.
    • Data modeling should be an intentional act, not something you do on the side.
    • The literature on data modeling is old; we are stuck in a table-centric view of the world.
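    The view of data contracts as interface files, making issues visible rather than only blocking, can be sketched as a simple record check (the "orders" feed and its fields are invented for illustration; real contracts usually also cover semantics, freshness, and ownership):

```python
from typing import Any

# A hypothetical contract for an incoming "orders" feed: field -> expected type.
ORDERS_CONTRACT: dict[str, type] = {
    "order_id": str,
    "amount": float,
    "currency": str,
}

def check_contract(record: dict[str, Any], contract: dict[str, type]) -> list[str]:
    """Return a list of violations instead of rejecting the record outright.

    A contract states expectations the way an interface file does; surfacing
    every deviation makes issues visible to producer and consumer alike.
    """
    issues = []
    for name, expected in contract.items():
        if name not in record:
            issues.append(f"missing field: {name}")
        elif not isinstance(record[name], expected):
            issues.append(
                f"{name}: expected {expected.__name__}, "
                f"got {type(record[name]).__name__}"
            )
    return issues

# A malformed record yields a readable violation report rather than a silent drop.
print(check_contract({"order_id": "A-1", "amount": "12.50"}, ORDERS_CONTRACT))
```

    Whether violations block the pipeline or merely alert is then a governance decision, not a technical accident.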
    --------  
    53:47
  • 4#8 - Shuang Wu - Service Platform: From Analytics to AI-Driven Success (Eng)
    «We want to make data actionable.»

    Join us for an engaging conversation with Shuang Wu, Mesta's lead data engineer. We delve into the concept of platforms and explore how they empower autonomous delivery teams, making data-driven decisions a central part of their strategy. Shuang discusses the intricate process of evolving from a mere data platform to a comprehensive service platform, especially within organizations that aren't IT-centric. Her insights emphasize a lean, agile approach to prioritizing use cases, focusing on quick iterations and prototypes that foster self-service and data democratization. We explore the potential shift towards a decentralized data structure where domain teams leverage data more effectively, driving operational changes and tangible business value in their pursuit of efficiency and impact.

    My key learnings:
    • It's not just about gaining insights, but also about harmonizing and understanding data in context.
    • Find your SMEs and involve them closely: you need inside knowledge about the data, paired with engineering capabilities.
    • Over time, the SMEs and the central data team share experiences and knowledge. This creates productive ground for working together. The more understanding business users gain of data, the more they want to build themselves.
    • The central team delivers core data assets in a robust and stable manner. Business teams can build on that.

    The Data
    • You can integrate and combine internal data with external sources (like weather data or road network data) to create valuable insights.
    • Utilizing external data can save you effort, since it is often structured and API-ready.
    • Don't over-engineer solutions: find out what your user requirements are and provide data that matches the requirements, not more.
    • Use an agile approach to prioritize use cases together with your business users.
    • Ensure you have a clear picture of potential value, but also of investment and cost.
    • Work in short iterations, to provide value quickly and constantly.
    • Understand your platform's constraints and limitations, also related to quality.
    • Find your WHY! Why am I doing the work, and what does that mean when it comes to prioritization? What is the value, impact, and effort needed?

    Service Platform
    • Is about offering self-service functionality.
    • Due to the size of Mesta, it made sense to take ownership of many data products centrally, closely aligned with the platform.
    • Build it as a foundation that can give rise to different digitalization initiatives.
    • If you want to make data actionable, it needs to be discoverable first.
    • The modular approach to the data platform allows you to scale up required functionality when needed, but also to scale to zero if not.
    • Verify requirements as early as you can.

    Working with business use cases
    • Visibility and discoverability of data stay a top priority.
    • Make data and AI literacy programs use-case based and hands-on.
    • You need to understand constraints when selecting and working with a business use case.
    • Start with a time-bound requirements analysis process that also analyses constraints within the data.
    • Once data is gathered and available on the platform, business case validity is much easier to verify.
    • Gather the most relevant data first, and then see how you can utilize it further once it is structured accordingly.
    • Quite often ideas originate in the business, and the central data team then validates whether the data can support the use case.
    --------  
    41:11
  • 4#7 - Victor Undli - From Hype to Innovation: Navigating Data Science and AI in Norway (Eng)
    «I think we are just seeing the beginning of what we can achieve in that field.»

    Step into the world of data science and AI as we welcome Victor Undli, a leading data scientist from Norway, who shares his insights into how this field has evolved from mere hype to a vital driver of innovation in Norwegian organizations. Discover how Victor's work with Ung.no, a Norwegian platform for teenagers, illustrates the profound social impact and value creation potential of data science, especially when it comes to directing young inquiring minds to the right experts using natural language processing. We'll discuss the challenges that organizations face in adopting data science, particularly the tendency to seek out pre-conceived solutions instead of targeting real issues with the right tools. This episode promises to illuminate how AI can enhance rather than replace human roles by balancing automation with human oversight.

    Join us as we explore the challenges of bridging the gap between academia and industry, with a spotlight on Norway's public sector as a cautious yet progressive player in tech advancement. Victor also shares his thoughts on developing a Norwegian language model that aligns with local values and culture, which could be pivotal as the AI Act comes into play. Learn about the unique role Norway can adopt in the AI landscape by becoming a model for small countries in utilizing large language models ethically and effectively. We highlight the components of successful machine learning projects: quality data, a strong use case, and effective execution, and encourage the power of imagination in idea development, calling on people from all backgrounds to engage.

    Here are my key takeaways:

    Getting started as a Data Scientist
    • Expectations are shaped by working with cutting-edge tech and chasing the last percentage of precision. Reality is much messier.
    • Time management and choosing ideas carefully are important.
    • «I end up with creating a lot of benchmark models with the time given, and then try to improve them in a later iteration.»
    • Data Science studies are very much about deep diving into models and their performance, almost unconcerned with technical limitations.
    • A lot of tasks when working with Data Science are in fact Data Engineering tasks.
    • Closing the gap between academia and industry is going to be hard.
    • Data Science is a team sport: you want someone to exchange with and work together with.

    Public vs. private
    • There is a difference between the public and private sectors in Norway.
    • The public sector in Norway is quite advanced in technological development.
    • The public sector acts more carefully.

    Stakeholder Management and Data Quality
    • It is important to communicate clearly and consistently with your stakeholders.
    • You have to compromise between stakeholder expectations and your constraints.
    • If you don't curate your data correctly, it will lose some of its potential over time.
    • Data Quality is central, especially when data is used for AI models.
    • Data curation is also a lot about data enrichment: filling in the gaps.

    AI and the need for a Norwegian LLM
    • AI can be categorized into the brain and the imagination. The brain is to understand; the imagination is to create.
    • We should invest time in creating an open-source Norwegian LLM as a competitive choice.
    • Language encapsulates culture. You need to embrace language to understand culture.
    • Norway's role is as a strong consumer of AI. That also means leading by example.
    • Norway and the Nordic countries can bring a strong ethical focus to the table.
    --------  
    31:27


About MetaDAMA - Data Management in the Nordics

This is DAMA Norway's podcast: an arena for sharing experiences within Data Management, showcasing competence and the level of knowledge in this field in the Nordics, getting in touch with professionals, spreading the word about Data Management, and not least promoting the Data Management profession.
