MindByte Issue #89: GitHub’s Victory, Copilot Innovations, and .NET 9 Performance Boosts


Welcome back, and for all the new subscribers, welcome aboard!

To ensure you keep getting these updates seamlessly, please move this email to your primary inbox or mark it as important. A quick reply like "got it" also helps boost visibility. This edition covers exciting topics such as:

  • Why GitHub became the real winner among source control hosting platforms

  • Copilot updates and using the new o1-preview model

  • Build your own GitHub Copilot Extension

  • Host OpenAI behind API Management

  • Performance improvements in .NET 9

  • CQRS Command Handlers with Marten

New here? Subscribe here to stay updated. Let's dive in.

Sending out this newsletter to 5,530 subscribers is something I do with love, but it also costs money. It would really help me if you visited my sponsor:

The Daily Newsletter for Intellectually Curious Readers

  • We scour 100+ sources daily

  • Read by CEOs, scientists, business owners and more

  • 3.5 million subscribers

Interested in sponsoring this newsletter? Contact me!

GitHub Digest

An interesting story about why GitHub became the winner and is now the top platform for Git hosting. It boils down to two things: they started at the right time, and they had good taste.

Of course, there is more to the story, so read the full history.

With OpenAI's new model announcements, we also get some nice improvements in the GitHub Copilot tooling.

Copilot had already been upgraded to GPT-4o, and it now supports a larger context window in chat. That means it can process larger files and hold longer conversations.

There are more updates, so read up on what has been changed.

When OpenAI announced o1-preview, I was eager to try it out. Its capabilities really impressed me, and it helped me write complex pieces of code.

The GitHub Copilot team looked at how they could integrate this new model into the Copilot experience and compared it to GPT-4.

Recently, GitHub announced the ability to use extensions inside the Copilot chat interface. You can, for example, bring a Docker, Sentry, Octopus Deploy, or LaunchDarkly extension into your conversation and interact with it.

Rob Bos took the new extension model and implemented his own agent. Read how he did it and ended up with a working agent.
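To give a feel for what building such an agent involves: at its core it is just an HTTP endpoint that Copilot calls with the chat payload and that streams an answer back. The sketch below is purely illustrative and not Rob's implementation; the payload handling and response format are simplified assumptions, so check the Copilot Extensions documentation for the actual contract.

```csharp
// Illustrative sketch only: a minimal ASP.NET Core endpoint shaped like a
// Copilot agent. The payload handling and response format are simplified
// assumptions; the real contract is defined in GitHub's Copilot Extensions docs.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapPost("/agent", async (HttpContext context) =>
{
    // Read whatever chat payload Copilot sent us (left untyped here).
    using var reader = new StreamReader(context.Request.Body);
    var requestBody = await reader.ReadToEndAsync();

    // Copilot agents stream their answer back as server-sent events.
    context.Response.ContentType = "text/event-stream";

    // A single hard-coded chunk in an OpenAI-style chat-completion shape.
    var chunk = """data: {"choices":[{"delta":{"content":"Hello from my Copilot agent!"}}]}""";
    await context.Response.WriteAsync(chunk + "\n\n");
    await context.Response.WriteAsync("data: [DONE]\n\n");
});

app.Run();
```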

Azure Updates & Insights

Directly exposing your Azure OpenAI instances to the outside world might not be the best idea. Adding an API Management instance in front of them offers some much-needed protection.

It can add rate limiting and distribute requests to the closest or best-performing OpenAI instance. And what about a circuit breaker to protect an overloaded backend instance? The article below describes how to set this up using Bicep.
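From the application side, very little changes: instead of calling the Azure OpenAI endpoint directly, you call the API Management gateway with a subscription key and let the gateway handle throttling, routing, and circuit breaking. Here is a rough sketch of what that consumer side could look like; the gateway URL, deployment name, API version, and api-key header are assumptions that depend on how your APIM instance is configured.

```csharp
// Sketch of calling Azure OpenAI *through* an API Management gateway instead
// of hitting the OpenAI endpoint directly. The URL shape, api-version, and
// "api-key" header are assumptions; they depend on your APIM configuration.
using System.Net.Http.Json;

var gateway = "https://my-apim-instance.azure-api.net"; // hypothetical gateway URL
var deployment = "gpt-4o";                              // hypothetical deployment name
var key = Environment.GetEnvironmentVariable("APIM_SUBSCRIPTION_KEY") ?? "<subscription-key>";

using var http = new HttpClient();
http.DefaultRequestHeaders.Add("api-key", key);

var response = await http.PostAsJsonAsync(
    $"{gateway}/openai/deployments/{deployment}/chat/completions?api-version=2024-06-01",
    new
    {
        messages = new[] { new { role = "user", content = "Say hello." } },
        max_tokens = 50
    });

// APIM can return 429 when its rate-limit policy kicks in; a real client
// would back off and retry here instead of just printing the result.
Console.WriteLine($"{(int)response.StatusCode}: {await response.Content.ReadAsStringAsync()}");
```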

As one of the largest data center operators in the world, Microsoft is keen to be energy efficient. It has set itself the goal of becoming carbon-negative by 2030, but the advances in AI come with high power demand.

That presents both opportunities and challenges. In the blog post below, they share how to work with AI more efficiently to reduce consumption, for example by scheduling workloads so that AI training tasks run during off-peak hours.

.NET Nook

The upcoming .NET 9 release again promises a nice increase in performance. But how much is there to gain?

Stephen Toub goes deep into the internals and shares a wealth of benchmark results showing how this new version is, once again, an improvement.
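If you want to verify the gains on your own hot paths, the usual approach is a micro-benchmark that targets both runtimes side by side. Here is a minimal BenchmarkDotNet sketch of that setup; the measured method is just a placeholder, and the project needs to multi-target net8.0 and net9.0.

```csharp
// Minimal BenchmarkDotNet sketch: run the same benchmark on .NET 8 and .NET 9
// and let the summary table show the difference. The measured method is only
// a placeholder; swap in the hot path you actually care about.
using System;
using System.Linq;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Jobs;
using BenchmarkDotNet.Running;

BenchmarkRunner.Run<TextSearchBenchmarks>();

[MemoryDiagnoser]
[SimpleJob(RuntimeMoniker.Net80)]
[SimpleJob(RuntimeMoniker.Net90)]
public class TextSearchBenchmarks
{
    private readonly string _haystack =
        string.Concat(Enumerable.Repeat("lorem ipsum dolor sit amet ", 1_000));

    [Benchmark]
    public int IndexOf() => _haystack.IndexOf("amet", StringComparison.Ordinal);
}
```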

I like reading through other people’s code and seeing how they tackled a certain problem. In this case, CQRS with command handlers in combination with the Marten framework.

Jeremy Miller uses an example to show how to implement a simple command-handling and projection system.
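To make the pattern a bit more concrete, here is a stripped-down sketch of the idea rather than Jeremy’s actual code: a command handler appends events to a Marten event stream, and the read model is aggregated back from those events. The domain types are invented for illustration.

```csharp
// Stripped-down sketch of CQRS with Marten's event store: a command handler
// appends events to a stream, and the read model is aggregated from those
// events. Domain types here are invented for illustration.
using Marten;

var store = DocumentStore.For("Host=localhost;Database=orders;Username=postgres;Password=postgres");

var handler = new PlaceOrderHandler(store);
var orderId = await handler.Handle(new PlaceOrder("Coffee beans", Quantity: 2));

await using var query = store.QuerySession();
var order = await query.Events.AggregateStreamAsync<Order>(orderId);
Console.WriteLine($"{order!.Item} x{order.Quantity}");

public record PlaceOrder(string Item, int Quantity);                // command
public record OrderPlaced(Guid OrderId, string Item, int Quantity); // event

public class PlaceOrderHandler(IDocumentStore store)
{
    public async Task<Guid> Handle(PlaceOrder command)
    {
        var orderId = Guid.NewGuid();
        await using var session = store.LightweightSession();
        session.Events.StartStream<Order>(orderId, new OrderPlaced(orderId, command.Item, command.Quantity));
        await session.SaveChangesAsync();
        return orderId;
    }
}

// Live aggregation: Marten rebuilds this read model by replaying the stream's events.
public class Order
{
    public Guid Id { get; set; }
    public string Item { get; set; } = "";
    public int Quantity { get; set; }

    public void Apply(OrderPlaced e) => (Id, Item, Quantity) = (e.OrderId, e.Item, e.Quantity);
}
```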

Closing Thoughts

Thank you for reading this week’s edition!

Your feedback is invaluable, so if you have any thoughts, questions, or suggestions, please don't hesitate to reach out by simply replying to this email.

If you enjoyed this update and want to continue receiving more, make sure to subscribe here.

I appreciate your time and look forward to hearing from you!

Did you like this edition?
