Today marks a significant milestone in my career – 5 years at Microsoft. It’s been an incredible journey, filled with growth, innovation, and a sense of community that I deeply cherish.
As I look back, I am filled with gratitude for the experiences and lessons learned along the way:
1. A Remarkably Supportive and Collaborative Culture
From my very first day at Microsoft, I was welcomed into a culture that values collaboration and support. The camaraderie among the team members is truly special. Whether it’s working on challenging projects or brainstorming new ideas, there’s always a sense of unity and mutual respect. This supportive environment has been a cornerstone of my growth and success here.
2. AI: The Heartbeat of Our Work
At Microsoft, AI isn’t just a buzzword – it’s the heartbeat of our work. I’ve had the privilege of witnessing firsthand how AI can drive transformative change across various industries. The innovative solutions we develop are not only cutting-edge but also have a profound impact on the world. It’s exhilarating to be part of a team that’s pushing the boundaries of what’s possible with AI.
3. Commitment to Responsible AI
One of the aspects I admire most about Microsoft is our unwavering commitment to responsible AI. We are dedicated to creating technology with integrity and humility. Every team, project, and initiative reflects this dedication. The emphasis on ethical AI development ensures that we are building a future where technology serves humanity positively and equitably.
4. Tech for Social Impact: Shaping a Better Future
Working with Tech for Social Impact (TSI) has been one of the most rewarding experiences of my career. We are not just envisioning a better future; we are actively shaping it. Our ambitious vision for Copilot is just the beginning of a transformative journey. The work we do at TSI has the potential to create significant positive change, and I am proud to be part of this mission.
5. Continuous Learning and Innovation
Continuous learning and innovation are at the core of Microsoft’s success. The opportunities for growth are endless, from the exciting projects in the Microsoft Garage to the numerous volunteering initiatives. Staying curious and constantly seeking to learn new things is encouraged and celebrated. This culture of continuous improvement is a driving force behind our collective achievements.
A big thank you to the mentors, managers, and colleagues who have been incredibly supportive. I’m immensely proud of our collective achievements and can’t wait to see what the future holds! Here’s to many more years of innovation, impact, and a shared vision of creating technology that empowers everyone.
If you’d like to stay updated on my journey and insights on AI, digital transformation, and more, follow me on LinkedIn, Twitter, and Instagram.
In the latest episode of Die Digitalisierung und Wir, we take on a very special topic: Singapore! Florian recently visited the city, which is considered one of the global hubs for technology and innovation, and shares his impressions and experiences with us. In this post, you'll learn how digitalization in Singapore affects various areas of life and what we can learn from it.
Serving Robots and Lab-Grown Meat: The Future of Gastronomy?
In Singapore, technological innovations are found not only in offices and labs but also in restaurants. Florian reports on his experiences with serving robots that deliver food to the tables in some restaurants. And it's not just the service: the dishes themselves are futuristic. Lab-grown meat is no rarity here and could be an answer to the growing demand for sustainable food.
Smart City: From Transport to Architecture
Singapore's metro system is a prime example of user-friendly and efficient public transport. Digitalization is visible on the streets as well: the ride-hailing service Grab has revolutionized getting around the city. And anyone walking through Singapore immediately notices the futuristic buildings and metamodern architecture.
Digital Communication and Innovation
WhatsApp is ubiquitous in Singapore and is even used in public services. The city is also home to the Japanese VC firm SoftBank, which invests in technology and innovation worldwide. And there is speculation that Jony Ive is working with OpenAI on a new AI device.
To wrap up, let's look beyond Singapore: artificial intelligence is not only present in the streets of Singapore but has also become indispensable in the offices of modern companies. In the blog post Künstliche Intelligenz im Controlling: Ein unverzichtbares Werkzeug für moderne Unternehmen, I explore the many applications of AI technologies such as GPT-4 and DALL-E 3 and use numerous practical examples to illustrate how they advance controlling processes.
Don't miss our other exciting discussions at the intersection of technology and business: subscribe to the podcast Die Digitalisierung und Wir and stay informed about the latest trends and developments.
Welcome to another edition of the Data & AI Digest! We’re excited to bring you a curated selection of the week’s most compelling stories in the realm of data science, artificial intelligence, and more. Whether you’re a seasoned expert or a curious beginner, there’s something here for everyone.
[AI] Understanding AI Performance: Discover how modern AI models often match or exceed human capabilities in tests, yet struggle in real-world applications. Read more
[AI] Generative AI Strategy for Tech Leaders: CIOs and CTOs need to integrate generative AI into their tech architecture effectively. Explore 5 key elements for successful implementation. Read more
[Statistics] Mastering the Central Limit Theorem in R: Understand the Central Limit Theorem, a cornerstone in statistics, and learn how to simulate it using R in this step-by-step tutorial. Read more
[Graph Theory] Comprehensive Introduction to Graph Theory: This quarter-long course covers everything from simple graphs to Eulerian circuits and spanning trees. Read more
[SQL] SQL Konferenz Highlights on Microsoft Fabric: Get an in-depth look at Microsoft Fabric and its role as a Data Platform for the Era of AI. Read more
[Microsoft] Forbes Insights on Microsoft’s Copilots: Learn six critical things every business owner should know about Microsoft Copilot. Read more
[GitHub] How GitHub’s Copilot is Being Used: GitHub’s Copilot remains the most popular AI-based code completion service. Find out the latest usage trends. Read more
[Apple] iPhone 15 Pro’s Spatial Videos: Teased at Apple’s latest keynote, learn about the new spatial video capabilities of the iPhone 15 Pro. Read more
[Geopolitics] China’s AI Influence Campaign: Researchers from Microsoft and other organizations discuss Beijing’s rapid change in disinformation tactics through AI. Read more
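The Central Limit Theorem item above is easy to reproduce yourself. The linked tutorial uses R, but here is a minimal sketch of the same simulation in Python: repeatedly draw samples from a uniform population and watch the sample means cluster around the population mean, regardless of the population's shape.

```python
import random
import statistics

random.seed(42)

def sample_means(n_samples=2000, sample_size=30):
    """Draw repeated samples from a uniform(0, 1) population and
    return the mean of each sample."""
    return [statistics.mean(random.random() for _ in range(sample_size))
            for _ in range(n_samples)]

means = sample_means()

# Population mean of uniform(0, 1) is 0.5; by the CLT the sample means
# are approximately normal around 0.5 with standard error
# sd(population) / sqrt(sample_size) = 0.2887 / sqrt(30) = ~0.053.
print(statistics.mean(means))
print(statistics.stdev(means))
```

Plotting a histogram of `means` shows the familiar bell curve, even though the underlying uniform distribution is flat.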
That’s a wrap for this week’s Data & AI Digest! We hope you found these articles insightful and thought-provoking. If you enjoyed this issue, help us make it bigger and better by sharing it with colleagues and friends. 🚀
Don’t forget, for real-time updates and discussions, join our LinkedIn Data & AI User Group. We look forward to your active participation and valuable insights.
In the complex world of data analytics, a data lake serves as a centralized repository where you can store all your structured and unstructured data at any scale. It offers immense flexibility, allowing you to run big data analytics and adapt to the needs of various types of applications. But imagine having more than just a data lake. Imagine having an entire suite of data management and analytics services that work seamlessly together. That’s where Microsoft Fabric comes in.
Microsoft Fabric is an all-in-one analytics solution designed for enterprises. It spans everything from data movement and data science to Real-Time Analytics and business intelligence. It offers a comprehensive suite of services, including a data lake, data engineering, and data integration, all conveniently located in one platform.
Use Cases of Microsoft Fabric in Data-Driven Companies
Microsoft Fabric covers all analytics requirements relevant to a data-driven company. Every user group, from data engineers to data analysts to data scientists, can work with the data in a unified way and easily share results with others. The areas of application at a glance:
Data Engineering: Data ingested with Data Factory can be transformed with high performance on a Spark platform and democratized via the Lakehouse. Models and key figures are created directly in Fabric.
Self-Service Analytics: Following the data mesh paradigm, individual data teams can each be provided with a decentralized self-service platform for building and distributing their own data products.
Data Science: Azure Machine Learning functionalities are available by default. Machine learning models for applied AI can be trained, deployed, and operationalized in the Fabric environment.
Real-Time Analytics: With Real-Time Analytics, Fabric includes an engine optimized for analyzing streaming data from a wide variety of sources – such as apps, IoT devices, or human interaction.
Data Governance: The OneLake as a unified repository enables IT teams to centrally manage and monitor governance and security standards for all components of the solution.
Users can also be supported at all levels by AI technologies. With Microsoft Copilot, Microsoft Fabric offers an intelligent assistant that translates natural-language instructions into concrete actions. Developers can, for example, generate program code, set up data pipelines, or build machine learning models this way. Likewise, business users can use Copilot to generate reports and visualizations for data analysis from natural-language prompts alone.
Simplifying Data Analytics: How Microsoft Fabric Offers a Unified, End-to-End Solution
With Fabric, you don’t need to piece together different services from multiple vendors. Instead, you can enjoy a highly integrated, end-to-end, and easy-to-use product that is designed to simplify your analytics needs. One conceivable deployment scenario for the future is data mesh domains with Microsoft Fabric that are connected to an existing lakehouse based on Azure Data Lake Storage Gen2 and Databricks or Synapse. In this setup, the lakehouse continues to handle the core data preparation tasks.
Meanwhile, the decentralized domain teams can use the quality-assured Lakehouse data via Microsoft Fabric using shortcuts to create and deploy their own use cases and data products. Such an approach could prove to be an ideal option, as it optimally complements the advantages of both approaches. The platform is built on a foundation of Software as a Service (SaaS), which takes simplicity and integration to a whole new level.
Microsoft Fabric is not just another addition to the crowded data analytics landscape. Centered around Microsoft’s OneLake data lake, it boasts integrations with Amazon S3 and Google Cloud Platform. The platform consolidates data integration tools, a Spark-based data engineering platform, real-time analytics, and, thanks to upgrades in Power BI, visualization, and AI-based analytics into a single, unified experience.
Microsoft Fabric Pricing Streamlines Your Data Stack for Optimal Cost Efficiency
The rapid innovation in data analytics technologies is a double-edged sword. On one hand, businesses have a plethora of tools at their disposal. On the other, the modern data stack has become increasingly fragmented, making it a daunting task to integrate various products and technologies. Microsoft Fabric aims to eliminate this "integration tax" that companies have grown tired of paying.
Microsoft Fabric is built around a unified compute infrastructure and a single data lake. This uniformity extends to product experience, governance, and even the business model. The platform brings together all data analytics workloads—data integration, engineering, warehousing, data science, real-time analytics, and business intelligence—under one roof.
Microsoft Fabric introduces a simplified pricing model focused on a common Fabric compute unit. This virtualized, serverless computing allows businesses to optimize costs by reusing the capacity they purchase. The multi-cloud approach, with built-in support for Amazon S3 and upcoming support for Google Storage, ensures that businesses are not locked into a single cloud vendor.
Enhanced Data Governance with Microsoft Purview
Data governance is another area where Microsoft Fabric excels. With Microsoft Purview, businesses can manage data access meticulously. For instance, confidential data exported to Power BI or Excel will automatically inherit the same confidentiality labels and encryption rules, ensuring security.
Microsoft Fabric also offers a no-code developer experience, enabling real-time data monitoring and action triggering. The platform will soon incorporate AI Copilot, designed to assist users in building data pipelines, generating code, and constructing machine learning models.
My Personal Experience so far
Having personally demoed Fabric to over 20 enterprises, the excitement is palpable. The platform simplifies data infrastructure while offering the flexibility of a multi-cloud approach. Most notably, it’s built around the open-source Apache Parquet format, allowing for easier data storage and retrieval.
Microsoft Fabric is currently in public preview and will be enabled for all Power BI tenants starting July 1. The platform promises to be more than just a tool; it aims to be a community where data professionals can collaborate, share knowledge, and grow. So, when someone asks you, "What is Microsoft Fabric?" you'll know it's not just a product; it's a revolution in data analytics.
Join our Microsoft Fabric & Power Platform LinkedIn Group!
Our LinkedIn group has changed its name to Microsoft Fabric & Power Platform to reflect the evolving ecosystem and the seamless integration between Power Platform technologies like Power BI, Power Apps, and Power Automate with Microsoft Fabric tools like OneLake and Synapse.
If you're as excited as I am about the future of data analytics and business intelligence, I invite you to join our LinkedIn group, Microsoft Fabric & Power Platform, a community dedicated to professionals who are eager to stay ahead of industry trends.
Following the talk, I was inspired by a conversation to leverage the power of GPT-4 and create an automatically generated summary of the Microsoft Teams transcript. This approach not only streamlines information sharing but also showcases the practical applications of advanced AI technology.
Below, I will share the key insights generated by GPT-4 and also include some captivating images from the event:
Decisively Digital: AI’s Impact on Society
In my talk, I drew inspiration from my book Decisively Digital, which discusses the impact of AI on society. I shared the innovative projects underway at Microsoft's AI for Good Lab. In light of GPT-4's recent launch, I also highlighted our mission to leverage technology to benefit humanity.
By harnessing Generative AI, we can stimulate the creation of innovative ideas and accelerate the pace of advancement. This cutting-edge technology is already transforming industries by streamlining drug development, expediting material design, and inspiring novel hypotheses. AI’s ability to identify patterns in vast datasets empowers humans to uncover insights that might have gone unnoticed.
Generative AI can Augment our Thinking
For instance, researchers have employed machine learning to predict chemical combinations with the potential to improve car batteries, ultimately identifying promising candidates for real-world testing. AI can efficiently sift through and analyze extensive information from diverse sources, filtering, grouping, and prioritizing relevant data. It can also generate knowledge graphs that reveal associations between seemingly unrelated data points, which can be invaluable for drug research, discovering novel therapies, and minimizing side effects.
„Now is the time to explore how Generative AI can augment our thinking and facilitate more meaningful interactions with others.“
Alexander Loth
At the AI for Good Lab, we are currently employing satellite imagery and generative AI models for damage assessment in Ukraine, with similar initiatives taking place in Turkey and Syria for earthquake relief. In the United States, our focus is on healthcare, specifically addressing discrepancies and imbalances through AI-driven analysis.
Our commitment to diversity and inclusion centers on fostering digital equality by expanding broadband access, facilitating high-speed internet availability, and promoting digital skills development. Additionally, we are dedicated to reducing carbon footprints and preserving biodiversity. For example, we collaborate with the NOAH organization to identify whales using AI technology and have developed an election propaganda index to expose the influence of fake news. Promising initial experiments using GPT-4 showcase its potential for fake news detection.
ChatGPT will be Empowered to Perform Real-time Website Crawling
While ChatGPT currently cannot crawl websites directly, it is built upon a training set of crawled data up to September 2021, which also serves as the foundation for the GPT-4 model. In the near future, the integration of plugins will empower ChatGPT to perform real-time website crawling, enhancing its ability to deliver relevant, up-to-date information and to handle sophisticated mathematics.
GPT-4 demonstrates remarkable reasoning capabilities, while Bing Chat offers valuable references for verifying news stories. AI encompasses various machine learning algorithms, including computer vision, statistical classifications, and even software that can generate source code. A notable example is the Codex model, a derivative of GPT-3, which excels at efficiently generating source code.
Microsoft has a long-standing interest in AI and is dedicated to making it accessible to a wider audience. The company’s partnership with OpenAI primarily focuses on the democratization of AI models, such as GPT and DALL-E. We have already integrated GPT-3 into Power BI and are actively developing integrations for Copilot across various products, such as Outlook, PowerPoint, Excel, Word, and Teams. Microsoft Graph is a versatile tool for accessing XML-based objects in documents and generating results using GPT algorithms.
Hardware, particularly GPUs, has played a pivotal role in the development of GPT-3. For those interested in experimenting with Generative AI on a very technical level, I recommend Stable Diffusion, which originated from latent diffusion research at LMU Munich. ChatGPT's emergence created a buzz, quickly amassing a vast user base and surpassing the growth of services like Uber and TikTok. Sustainability remains a crucial concern, and Microsoft is striving to achieve carbon-negative status.
Generative AI Models have garnered Criticism due to their Dual-use Nature
Despite their potential, Generative AI models such as GPT-3 have also garnered criticism due to their dual-use nature and potential negative societal repercussions. Some concerns include the possibility of automated hacking, photo manipulation, and the spread of fake news (➡️ deepfake discussion on LinkedIn). To ensure responsible AI development, numerous efforts are being undertaken to minimize reported biases in the GPT models. By actively working on refining algorithms and incorporating feedback from users and experts, developers can mitigate potential risks and promote a more ethical and inclusive AI ecosystem.
Moving forward, it is essential to maintain open dialogue and collaboration between AI developers, researchers, policymakers, and users. This collaborative approach will enable us to strike a balance between harnessing the immense potential of AI technologies like GPT and ensuring the protection of society from unintended negative consequences.
GPT-3.5 closely mimics human cognition. However, GPT-4 transcends its forerunner with its remarkable reasoning capabilities and contextual understanding. GPT models leverage tokens to establish and maintain the context of the text, ensuring coherent and relevant output. The GPT-4-32K model boasts an impressive capacity to handle 32,000 tokens, allowing it to process extensive amounts of text efficiently. To preserve the context and ensure the continuity of the generated text, GPT-4 employs various strategies that adapt to different tasks and content types.
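The token budget described above has a practical consequence: once a conversation exceeds the model's context window, older text must be dropped or summarized. The following is a toy illustration of that idea, a sliding context window with a crude word-based token estimate; real GPT models use a byte-pair-encoding tokenizer, so the counts here are assumptions for demonstration only.

```python
def truncate_context(messages, max_tokens=32000):
    """Keep the most recent messages whose combined (approximate)
    token count fits within the model's context window.

    Token cost is approximated as the number of whitespace-separated
    words; a real implementation would use the model's tokenizer.
    """
    kept = []
    budget = max_tokens
    for msg in reversed(messages):   # walk from newest to oldest
        cost = len(msg.split())      # crude token estimate
        if cost > budget:
            break                    # older messages no longer fit
        kept.append(msg)
        budget -= cost
    return list(reversed(kept))      # restore chronological order
```

For example, with a budget of 4 "tokens", `truncate_context(["a b", "c d e", "f"], max_tokens=4)` keeps only the two newest messages, `["c d e", "f"]`, since adding the oldest would exceed the budget. GPT-4's strategies for preserving continuity are more sophisticated than this, but the underlying constraint is the same.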
GPT-4 Features a Robust Foundation in Common Sense Reasoning
One of GPT-4’s defining features is its robust foundation in common sense reasoning. This attribute significantly contributes to its heightened intelligence, enabling the AI model to generate output that is not only coherent but also demonstrates a deep understanding of the subject matter. As GPT-4 continues to evolve and refine its capabilities, it promises to revolutionize the field of artificial intelligence, expanding the horizons of what AI models can achieve and paving the way for future breakthroughs in the realm of generative AI.
In the near future, advanced tools like ChatGPT will elucidate intricate relationships without requiring us to sift through countless websites and articles, further amplifying the transformative impact of Generative AI.
I appreciate the opportunity to share my insights at the German Chapter of the ACM.
Did you enjoy this GPT-generated Summary of my Talk?
Leveraging GPT-4 to generate a summary of my talk was an exciting experiment, and I have to admit, the results are impressive. GPT was able to provide a brief overview of the key takeaways from my talk.
Now, I would love to hear about your experiences with GPT so far. Feel free to share your thoughts in the comments section of this Twitter thread or this LinkedIn post: