
The Tech Trek

Elevano

621 episodes

  • The Tech Trek

    The Hidden Fintech Behind the Compute Boom

    2026/2/11 | 23 mins.
    Gabe Ravacci, CTO and co-founder at Internet Backyard, breaks down what the “compute economy” really looks like when you zoom in on data centers, billing, invoicing, and the financial plumbing nobody wants to touch. He shares how a rejected YC application, a finance stint, and a handful of hard lessons pushed him from hardware curiosity to building fintech infrastructure for compute.
    If you care about where compute is headed, or you are early in your career and trying to find your path without overplanning it, this one will land.

    Key Takeaways
    • Startups often happen “by accident” when your competence meets the right problem at the right time
    • Compute accessibility is not only a chip problem, it is also a finance and operations problem
    • Rejection can be data, not a verdict: treat it as feedback to sharpen the craft
    • A real online presence is less about networking and more about being genuinely useful in public
    • Time blocking and single-task focus beat grinding when you are juggling school, work, and a startup

    Timestamped Highlights
    00:28 What Internet Backyard is building, fintech infrastructure for data center financial operations
    01:37 The first startup attempt, cheaper compute via FPGA based prototyping, and why investors passed
    04:48 The pivot, from hardware tools to a finance informed view of compute and transparency gaps
    06:55 How Gabe reframed YC rejection, process over outcome, “a tree of failures” that builds skill
    08:29 Building a digital brand on X, what he posted, how he learned in public, and why it worked
    13:36 The real balancing act, dropping classes, finishing the degree well, and strict time blocking
    20:00 Books that shaped his thinking, Siddhartha, The Art of Learning, Finite and Infinite Games

    A line worth keeping
    “The process is really more important than any outcome.”

    Pro Tips for builders
    • Treat learning like a skill, ask better questions before you chase better answers
    • Make focus a system, set blocks, mute distractions, and do one thing at a time
    • Share what you are learning in public, not to perform, but to be useful and find signal

    Call to Action
    If this episode sparked an idea, follow or subscribe so you do not miss the next one. Also check out Amir’s newsletter for more conversations at the intersection of people, impact, and technology.
  • The Tech Trek

    Data Fabric Meets AI, The Trust Layer Most Teams Skip

    2026/2/10 | 29 mins.
    Data leaders are being asked to ship real AI outcomes while the foundations are still messy. In this conversation, Dave Shuman, Chief Data Officer at Precisely, breaks down what actually determines whether AI adoption sticks, from hiring “comb shaped” talent to building trusted data products that make AI outputs believable and usable.

    If you are building in data, AI, or analytics, this episode is a practical map for what needs to be true before AI can move from demos to dependable, repeatable impact.

    Key Takeaways

    • Comb shaped talent beats narrow specialization, AI work rewards people who can span multiple skills and collaborate well
    • Adoption is a trust problem, and trust starts with data integrity, lineage, context, and a semantic layer that business users can understand
    • Open source drives the innovation, commercialization makes it safe and usable at enterprise scale, especially around security and support
    • Data must be fit for purpose, start every AI project by asking what data it needs, who curates it, and what the known warts are
    • Humans are still the last mile, small workflow choices can make adoption jump, even when the model is already accurate

    Timestamped Highlights

    00:56 The shift from T shaped to comb shaped talent, what modern AI teams actually need to look like
    05:36 Hiring for team fit over “world class” niche skills, and when to bring in trusted partners for depth
    07:37 How open source sparks the ideas, and why enterprises still need hardened, supported versions to scale
    11:31 Where AI adoption is today, why summarization is only the beginning, and what unlocks “AI 2.0”
    13:39 The trust stack for AI, clean integrated data, lineage, context, catalog, semantic layer, then agents
    19:26 A real adoption lesson from machine learning, and why the human experience decides if the system wins

    A line worth stealing

    “You do not just take generative AI and throw it at your chaos of data and expect it to make magic out of it.”

    Pro Tips for data and AI leaders

    • Hire and build teams like Tetris, fill skill voids across the group instead of chasing one perfect profile
    • Use partners for the sharp edges, but require knowledge transfer so your team levels up every engagement
    • Make adoption easier by designing for human behavior, sometimes the smallest workflow tweak beats more accuracy
    • Build governed data products in a catalog, then validate AI outputs side by side with dashboards to earn trust fast

    Call to Action

    If this helped you think more clearly about AI adoption, talent, and data foundations, follow the show and turn on notifications so you do not miss the next episode. Also, share it with one data or engineering leader who is trying to get AI out of pilots and into real workflows.
  • The Tech Trek

    Cloud Costs vs AI Workloads, The Storage Decisions That Decide Scale

    2026/2/09 | 26 mins.
    Cloud bills are climbing, AI pipelines are exploding, and storage is quietly becoming the bottleneck nobody wants to own. Ugur Tigli, CTO at MinIO, breaks down what actually changes when AI workloads hit your infrastructure, and how teams can keep performance high without letting costs spiral.

    In this conversation, we get practical about object storage, S3 as the modern standard, what open source really means for security and speed, and why “cloud” is more of an operating model than a place.

    Key takeaways

    • AI multiplies data, not just compute, training and inference create more checkpoints, more versions, more storage pressure
    • Object storage and S3 are simplifying the persistence layer, even as the layers above it get more complex
    • Open source can improve security feedback loops because the community surfaces regressions fast, the real risk is running unsupported, outdated versions
    • Public cloud costs are often less about storage and more about variable charges like egress, many teams move data on prem to regain predictability
    • The bar for infrastructure teams is rising, Kubernetes, modern storage, and AI workflow literacy are becoming table stakes

    Timestamped highlights

    00:00 Why cloud and AI workloads force a fresh look at storage, operating models, and cost control
    00:00 What MinIO is, and why high performance object storage sits at the center of modern data platforms
    01:23 Why MinIO chose open source, and how they balance freedom with commercial reality
    04:08 Open source and security, why faster feedback beats the closed source perception, plus the real risk factor
    09:44 Cloud cost realities, egress, replication, and why “fixed costs” drive many teams back inside their own walls
    15:04 The persistence layer is getting simpler, S3 becomes the standard, while the upper stack gets messier
    18:00 Skills gap, why teams need DevOps plus AIOps thinking to run modern storage at scale
    20:22 What happens to AI costs next, competition, software ecosystem maturity, and why data growth still wins

    A line worth keeping

    “Cloud is not a destination for us, it’s more of an operating model.”

    Pro tips for builders and tech leaders

    • If your AI initiative is still a pilot, track egress and data movement early, that is where “surprise” costs tend to show up
    • Standardize around containerized deployment where possible, it reduces the gap between public and private environments, but plan for integration friction like identity and key management
    • Treat storage as a performance system, not a procurement line item, the right persistence layer can unblock training, inference, and downstream pipelines

    What's next:
    If you’re building with AI, running data platforms, or trying to get your cloud costs under control, follow the show and subscribe so you do not miss upcoming episodes. Share this one with a teammate who owns infrastructure, data, or platform engineering.
  • The Tech Trek

    AI Is Changing Art Faster Than You Think

    2026/2/06 | 50 mins.
    This is an early conversation I am bringing back because it feels even more relevant now: the intersection of AI and art is turning into a real cultural shift.

    I sit down with Marnie Benney, independent curator at the intersection of contemporary art and technology, and co-founder of AIartists.org, a major community for artists working with AI. We talk about what AI art actually is beyond the headlines, where authorship gets messy, and why artists might be the best people to pressure test the societal impact of machine learning.

    Key takeaways

    • AI in art is not a single thing, it is a spectrum of choices, dataset, process, medium, and intent
    • The most interesting work treats AI as a collaborator, not a shortcut, a back and forth that reshapes the artist’s decisions
    • Authorship is still unsettled, some artists see AI as a tool like an instrument, others treat it as a creative partner
    • The fear that AI replaces creativity misses the point, artists can use the machine’s unexpected output to expand human expression
    • Access matters, compute, tooling, and collaboration between artists and technologists will shape who gets to experiment at the frontier

    Timestamped highlights

    00:04:00 Curating science, climate, and public engagement, the path into tech driven exhibitions
    00:07:41 What AI art can mean in practice, datasets, iteration loops, and choosing an output medium
    00:10:48 Who gets credit, tool versus collaborator, and the art world’s evolving rules
    00:13:51 Fear, job displacement, and a healthier frame, human plus machine as a creative partnership
    00:22:57 The new skill stack, what artists need to learn, and where collaboration beats handoffs
    00:29:28 The pushback from traditional art circles, philosophy and intention versus novelty
    00:37:17 Inside the New York exhibition, collaboration between human and machine, visuals, sculpture, and sound
    00:48:16 The magic of the unknown, why the output can surprise even the artist

    A line that stuck

    “Artists are largely showing a mirror to society of what this technology is, for the positive and the negative.”

    Pro tips for builders and operators

    • Treat creative communities as an early signal, artists surface second order effects before markets do
    • If you are building AI products, study authorship debates, they map directly to credit, accountability, and trust
    • Collaboration beats delegation, when domain experts and technologists iterate together, the work gets sharper fast

    Call to action

    If this episode hits for you, follow the show so you do not miss the next drop. And if you are building in data, AI, or modern tech teams, follow me on LinkedIn for more conversations that connect technology to real world impact.
  • The Tech Trek

    AI in the Enterprise, Why Pilots Fail and What Actually Scales

    2026/2/05 | 23 mins.
    Most teams are approaching AI from the wrong direction, either chasing the tech with no clear problem or spinning up endless pilots that never earn their keep. In this episode, Amir Bormand sits down with Steve Wunker, Managing Director at New Markets Advisors and co-author of AI and the Octopus Organization, to break down what actually works in enterprise AI.

    You will hear why the real challenge is organizational, not technical, how IT and business have to co own the outcome, and what it takes to keep AI systems valuable over time. If you are trying to move beyond experimentation and into real impact, this conversation gives you a practical blueprint.

    Key takeaways

    • Pick a handful of high impact problems, not hundreds of small pilots, focus is what creates measurable ROI
    • Treat AI as a workflow and change program, not a tool you bolt onto an existing process
    • IT has to evolve from order taker to strategic partner, including stronger AI ops and ongoing evaluation
    • Start with the destination, redefine the value proposition first, then redesign the operating model around it
    • Ongoing ownership matters, AI is not a one and done delivery, it needs stewardship to stay useful

    Timestamped highlights

    00:39 What New Markets Advisors actually does, innovation with a capital I, plus AI in value props and operations
    01:54 The two common mistakes, pushing AI everywhere and launching hundreds of disconnected pilots
    04:19 Why IT cannot just take orders anymore, plus why AI ops is not the same as DevOps
    07:56 Why the octopus is the perfect model for an AI age organization, distributed intelligence and rapid coordination
    11:08 The HelloFresh example, redesign the destination first, then let everything cascade from that
    17:37 The line you will remember, AI is an ongoing commitment, not a project you ship and forget
    20:50 A cautionary pattern from the dotcom era, avoid swinging from timid pilots to extreme headcount mandates

    A line worth keeping

    “You cannot date your AI system, you need to get married to it.”

    Pro tips for leaders building real AI outcomes

    • Define success metrics before you build, then measure pre and post, otherwise you are guessing
    • Redesign the process, do not just swap one step for a model, aim for fewer steps, not faster steps
    • Assign long term ownership, budget for maintenance, evaluation, and model oversight from day one

    Call to action

    If this episode helped you rethink how to drive AI results, follow the show and subscribe so you do not miss the next conversation. Share it with a leader who is stuck in pilot mode and wants a path to production.


About The Tech Trek

The Tech Trek is a podcast for founders, builders, and operators who are in the arena building world class tech companies. Host Amir Bormand sits down with the people responsible for product, engineering, data, and growth and digs into how they ship, who they hire, and what they do when things break. If you want a clear view into how modern startups really get built, from first line of code to traction and scale, this show takes you inside the work.
