20: SPI in 2049, new “Supporters” page, and MacWhisper audio transcription
So I've done the math. If people stop adding new packages today,
we can run this show for another 26 years before we run out of packages to recommend.
Although if last night's script that checks for renamed and moved packages has its way,
we'll have zero packages next week because it tried to remove about 95% of our packages yesterday.
Yeah, that's it folks. We had a good run. Fortunately, there are no more packages.
They're all gone.
- So, which one of those is gonna win, is my question.
- What do you mean, of the ones that we recommend or?
- No, it's will we either go 26 years and then run out,
or will it succeed and remove all the packages tomorrow?
- Well, I think that's for us to merge, right?
We could have been done already if we were merge-happy.
So, so--
- Yeah, it's a good time we should take the diff.
- Yeah, yeah, no, it's great to merge.
So what do you think our SPI build matrix will look like in 2049?
Well, it's funny you should say that because one package that I was looking at this week
was a package called Relax by Thomas De Leon.
And I know we're not doing package recommendations yet, but because you mentioned that, I thought
it was worth bringing up. So the package itself is a REST API client for Swift.
I'm sure it's great, but that's not really what I want to talk about here.
What was interesting is that Thomas, in the README, does his own supported platforms matrix,
which is not dynamic, of course, it's just written out.
But it made me think about our matrix and how it doesn't quite capture everything.
And of course, it's never going to capture everything; that was never the intent.
But there was something quite subtle in what Thomas did that I thought was worth talking
about, and that was: he's got each platform and then the minimum version of the platform
that the package supports. It needs Swift 5.7, but then the platform version it can deploy to
can actually be older than the one that shipped with the Xcode that came with Swift 5.7, if you're
with me.
I am not but…
You're not with me, okay.
- What could go wrong?
I'm sure the audience is with you; I'll just be the rubber duck.
- No, I'm sure nobody is.
I'm sure nobody is.
Let me explain it a different way.
You need Xcode 14 to compile this package
because you need Swift 5.7
and the only version of Xcode that comes with Swift 5.7
is Xcode 14.
But then you could deploy back to iOS 14 or macOS 12
because even though it needed 5.7 to compile,
the deployment version can actually be previous
to the current version of iOS and macOS, of course.
And it's a really subtle,
in fact so subtle that it slipped past you,
but quite important distinction to make,
and it's not even something we attempt to capture.
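To make that distinction concrete, here is a hypothetical package manifest (an illustrative sketch, not taken from Relax itself): the tools version at the top pins the Swift, and therefore Xcode, version you need to build, while the platforms entries set the minimum OS versions an app using the package can deploy to.

```swift
// swift-tools-version:5.7
// Building this manifest requires the Swift 5.7 tools (i.e. Xcode 14 on Apple platforms)...
import PackageDescription

let package = Package(
    name: "ExamplePackage",   // hypothetical package, purely for illustration
    platforms: [
        .iOS(.v14),           // ...but apps that use it can still deploy back to iOS 14,
        .macOS(.v12)          // ...or macOS 12, because the deployment target is separate
    ],
    products: [
        .library(name: "ExamplePackage", targets: ["ExamplePackage"]),
    ],
    targets: [
        .target(name: "ExamplePackage"),
    ]
)
```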
- Right, and that's possible now
because of ABI stability, right?
Because before, that was really closely tied.
You couldn't back deploy to an older platform version.
So I think, I mean, we do our best with our platform matrix, but it does show that even
with the amount of work that goes into creating that platform matrix, we can't possibly represent
the real world compatibility of every package.
I wonder if it would make sense to try and... I mean, what I'm getting at is: there's a
minimum Swift version you need for development and one you potentially need for deployment.
Most of the time they're probably the same, but I wonder if it's worth having, like, a little explainer
of what the version is that we're showing there, which is the development version. I don't know.
Right. Yeah. The one thing I don't want to do is make that matrix any more complicated than it currently is.
Yeah. Yeah. I think that's about as much as it should be.
Plus, I mean, you shouldn't take that as gospel anyway, because sometimes it doesn't
take much to expand compatibility, you know, either platform or Swift version.
Swift version is probably easier to expand because, especially in the 5.x range, if
something runs on 5.6 and you need 5.5, if you spend a little time,
you can probably make it work.
I think async/await is probably the biggest showstopper there,
and that's quite easy to figure out. But
wasn't async/await also backported,
to iOS 14 or 15 maybe? 14, I think? Well,
I think it was in 5.5 to begin with,
so that's probably a bad example, but
before that I don't think it was back-deployable beyond 5.5. I'm not entirely
sure. I need to check our matrix. So
yes, on your question of where our matrix will be in 26 years, or however long it was: it will be much more
complicated, but at the same time it still won't capture everything.
Yeah, so what version are we up to then? Is it like Swift 10?
Well, they're not really going like that. So iOS has a very stable,
predictable release schedule. Every year that goes past, we get a new version of iOS
and we add one to the number. Swift is not playing the same game as iOS.
Now it looks like it's about two point releases per year, and typically, well,
4 didn't go up that high, right? What was the highest 4.x?
It was 4.2, the last one.
Well I'm sure Swift will still be around in 26 years if Objective-C is any...
Well, Objective-C will probably still be around.
Well, Objective-C had almost no changes for 20 years.
Yeah, fair point.
Actually, that's not true. That's a little unfair. But it certainly didn't change. It
changed less in 20 years than Swift has changed in five.
Yeah, yeah, that's for sure. Yeah, interesting to think about, isn't it, to try and extrapolate.
I mean, it never works, really. There are shifts coming. Who knows what it's going to look
like in 26 years. But platform-wise, we'll probably see a few changes. I mean, there's
RealityOS out there.
That's probably going to be another,
you know, maybe this summer, maybe next summer,
there's going to be a new platform.
We'll probably add Windows at some point, Wasm.
So there's a lot of change coming to that aspect
of the build matrix, probably fairly soon.
- Yeah, and I think Windows support in the platform matrix
is something we should tackle reasonably soon.
I'm not going to put a time period on it,
but I was having some conversations
with people down at the ServerSide.swift conference
and that topic came up several times.
And I think it's something that we should at least
take a quick look into.
- Yeah.
- How does our build tool...
does it even work with Swift on Windows at the moment?
I'm guessing it probably does, but who knows.
And then, I mean, I know that for me,
it's a very long time since I administered a Windows server,
and I'm sure it's the same for you.
- We probably won't have to.
I've just very recently thought about this a bit,
and that was because there's an Evolution pitch,
I think it's a pitch, it's not quite a proposal yet,
about cross-compiling.
- Oh, I did see that, yeah.
- And bringing full toolchain support for cross-compiling.
And I think that's the way we should try and do it,
if at all possible, because the results should be reliable,
I mean, it's part of the toolchain, right?
We should be able to rely on that
and then it'll take away a lot of the headaches
around different platform support.
I mean, we might even then consider building Linux on Macs
or vice versa, but it'll open up possibilities
to simplify the way we run the platform compatibility
while also expanding it and not going crazy
because adding a different platform is one thing,
but then managing the capacity
because those builders will only then be able to build
that one platform and that is terrible
for capacity management, you know,
like utilization and to make sure
that you're not over or under provisioned
on a certain slice.
We had that problem early on, and even right now
we have only one builder that can build 5.7,
because of how the macOS versions line up
with the Xcode versions that require certain macOS versions.
And that's a problem, because if one builder isn't redundant
and it struggles or has some outage or something,
we rack up builds that aren't processed.
And then because we only allow a certain depth
of the build pipeline before we stop enqueuing new builds,
one builder going down and filling up the pipelines
with unprocessed builds for a certain platform
will block the whole thing.
And the more we have the ability to have builders
be agnostic and able to build any sort of Swift version
or platform, the better our redundancy and
utilization across the board. So I
haven't looked into this proposal more
than knowing it existed. Are you saying
that you could build all of the Apple
platforms on Linux at that point? For Apple
platforms I'm not a hundred percent sure,
because of the SDKs. Of course. That is
probably a limitation where you will
still need the platform,
but I think the other way
around, Apple platforms should be able to
build all the others because they won't
have any SDK dependencies, right?
I mean, more than beyond what we see already, right?
We have some Linux packages that need certain SDKs
on Linux that we might have to provide per platform,
but that's sort of a different problem.
But yes, potentially we would be able
to just have Mac builders and run, you know,
any kind of build off of those.
- The reason I asked the question the other way around
was because one of the nice things
about our Linux build environment
is it's entirely Dockerized.
And the Mac builders are not, they just run versions of Xcode sitting on a Mac operating system.
And so there is that slight advantage of having dockerized builds because it does isolate the build a little bit more than we currently do on our Mac build machines.
To be honest, I think the package we talked about or the tool we talked about last time, Tart,
is something that I still have in the back of my mind to solve that part on macOS as well.
Right.
Because with Ventura and Big Sur, we're already arriving at a place where Macs will soon be able to run,
at least virtualized, any Xcode version that we support on any Mac,
because we'll always be able to virtualize the required OS,
you know, the one that needs to come in tandem with the Xcode version.
So we'll probably be able to provide images
that any of the builders can pull,
just like our Linux builders pull the Docker images right now,
and then run the builds virtualized.
And I think in all my testing,
I haven't seen any big build penalty in doing that.
And I'm really, really hopeful that this will settle down
and allow us to actually do that across the board.
And that would actually make maintaining these machines much easier because all
we need to do is set up base images, just like we do for Linux right now,
with a macOS version and an Xcode version, and those would be pulled
by the builders for whatever Xcode and Swift version they're interested in
building, and that way there isn't any managing of which builder can build
what Swift version, any will be able to build any.
And we've talked about this in the past, and I was a little hesitant to go
down the virtualization route.
But after looking at Tart, I think we could definitely do it. What I'd be
keen to keep is that either we do builds virtualized or we do builds just on the
machine, rather than trying to mix anything where, like, Xcode 14 builds just operate on
the machine because they're on Ventura.
Yeah.
And so the machine itself wouldn't even necessarily,
in fact it wouldn't have Xcode installed on it.
It would just be a host.
- Yeah, yeah, exactly.
Yeah, I mean, absolutely, I'd agree.
I mean, there's probably going to be a phase
where we test it and don't fully commit,
and have a couple running initially to see how it works.
But once it does, I absolutely agree.
I mean, what's the point of streamlining that
when you still run a mix of bare metal
and virtualized machines?
And just for consistency.
- Yeah, it'll make managing the machines easier.
I mean, you probably, you know,
you still need to also update the base machine,
but at least it's only an OS update then,
which you probably don't need to be
as up to date necessarily.
But other than that, it's managing base images,
which get deployed across the board automatically.
So there's no managing of, you know,
five Macs or however many it is individually,
and Xcode versions.
So yeah, let's see how that goes.
- What we will need though is at least one more Mac builder
to test that on because we're currently,
if we take any of those build machines out of rotation,
we lose the ability to build certain versions of Swift.
- Yeah, can't have enough Macs.
- That's right, order one more.
- Well, we've almost transitioned into news, haven't we?
Is there anything else we want to cover?
I put a new page up on the site today.
It got merged in this afternoon.
And it's a thank you page.
So it's a thank you page on the site,
which is saying thank you to,
I mean, we've talked a little bit about this last episode,
saying thank you to the corporate sponsors
who support the site,
our infrastructure sponsors who support the site,
and then the hundred or so community sponsors.
And so we have this page now
where we have the corporate and the infrastructure sponsors at the top,
and then everybody's GitHub profile and name and username
if they also support the site through GitHub sponsors.
So that went live this afternoon.
That's a nice little thank you to all the support that we get.
Yeah, it's great.
It looks really great with all the avatars on it to show who all is sponsoring.
Really great to see.
Well, the one quite nice thing when I was developing it
is that it does take a long time to scroll to the bottom.
- Yeah, that's like, make the list longer everyone.
- Yes, in fact, there is, okay, so here's a little secret
that most people probably wouldn't see.
There is a little secret Easter egg
at the bottom of the page.
- Oh, I did not see that.
I need to check that out.
- Even Sven has not scrolled to the bottom.
- Right, there's another bit of news.
We started adding transcripts to our episodes recently
and there's a really nice tool I wanted to give a shout out to
which is called MacWhisper by Jordi Bruin.
It's a Mac app, a UI wrapper
around the Whisper transcription tool by OpenAI.
And I really like this.
This is the kind of AI that I can get behind,
because it's pretty straightforward
what it does: it ingests audio and transcribes it, so it takes the tedium out of a chore and
gives you a really great result. It allows different quality settings, and at the highest
setting it really produces remarkable results, like identifying laughter and even transcribing
that into the text, and identifying terms of art, you know, like we talked about Graviton
on AWS, and all that stuff came out really great in the last episode.
It's also great for warming up your MacBook on a winter morning when it's way too cold to
the touch, because if you run this for a while, even an M1 Mac will get warm, and
after five minutes the fans will start kicking in, you know, just a bit.
So it's really nice.
The speed is also remarkable.
I mean, it does take a while, but it transcribes at about 2x on an M1 Max, and that's with 60%
utilization. It could probably go a bit faster; I'm not sure what the limit is or the limiting
factor there. But it's a really great tool. If you have audio, or I think it also
does video, you can just drag the file into the window and it'll go off and give you a timestamped
transcription. It gives you files that you can then use elsewhere; we added this to our
YouTube stream as well, and then the stream has subtitles. I mean, there's
no video of us talking there, but you get subtitles on the static
image because it's timestamped. So it's a really, really great tool.
The results are fantastic and when you say 2x do you mean double speed or half speed?
So double speed. Double speed, so it takes half as long as the actual episode to transcribe it.
Yeah, exactly. Our 45 minute episode took, well, I didn't actually time it exactly, but
after 10 minutes of running it had done half, so I think it would have taken roughly
20-ish minutes to do the whole 45 minutes.
The ironic thing there is that when I export the podcast audio, it is at less than 1x:
it takes longer to export than it does to play back at one-to-one speed.
And I think the reason for that is the filters that we add on to clean up the
audio and make it sound reasonable. I think it's those that are taking a
very long time, but it certainly does take a long time to export the audio.
Interesting. You've got an M1 Max as well, right? I do. It's time to
upgrade to an M1 Ultra. Another bit of news: we've been talking about the doc
uploader the last couple of episodes, and this is now live. We have switched
over, thanks to a little help. After the last episode someone got in touch
and helped us speed up the whole uploading process a bit, where we upload
from the Lambda to S3. So that's now running faster. It does need a couple of
retries sometimes, because we try to upload so many files in such a short while that
S3 rate limits us. I think you can actually request those limits to be lifted; that's
probably something to look into. I heard there are ways to get extra bandwidth or extra
allowance, but it's certainly workable right now. We're not quite up against the Lambda limit
as tightly as we were before, so that's looking really good. This whole thing is a typical example
of something that got complicated real fast.
I mean, the thing is, experience only really ever reduces
your level of surprise, but you never really get to the point
where you anticipate all the complication,
all the complicated stuff you end up dealing with.
But there you go.
It's certainly been a way longer task
than I thought at the start.
I think this started in December, right?
We started, yeah, it must have been mid-December
when this whole piece of work started.
There we go, it's live now.
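On those throttled uploads, by the way: the usual shape of the fix is retrying with exponential backoff. A generic sketch of that pattern follows, not the actual doc uploader code, with "uploadFile" standing in as a hypothetical placeholder.

```swift
import Foundation

// Generic retry-with-exponential-backoff sketch, not the actual doc uploader code:
// the pattern you reach for when S3 (or any API) starts rate limiting bursts of requests.
func withRetries<T>(maxAttempts: Int = 5,
                    initialDelay: TimeInterval = 0.5,
                    operation: () async throws -> T) async throws -> T {
    var delay = initialDelay
    for attempt in 1...maxAttempts {
        do {
            return try await operation()
        } catch {
            guard attempt < maxAttempts else { throw error }
            // Back off before the next attempt: 0.5s, 1s, 2s, ...
            try await Task.sleep(nanoseconds: UInt64(delay * 1_000_000_000))
            delay *= 2
        }
    }
    preconditionFailure("unreachable: the loop always returns or throws")
}

// Usage sketch, wrapping each upload call ("uploadFile" is a hypothetical placeholder):
// try await withRetries { try await uploadFile(to: s3Bucket) }
```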
And I think the other bit of news on that is that it's not only live,
because I think last time we talked about it,
we talked about only enabling it for very large packages.
But if I understand correctly, it is now live actually for every package.
So every package goes through that Lambda upload process, I think.
Yeah, that's correct.
Yeah, which is good, because that was always the eventual plan,
you know, to get it there eventually.
In reality, we got there very quickly.
And it's never great to have two completely different code paths
that something could go through,
so it's good that we can just say:
this is the way that documentation gets uploaded now.
Yeah, absolutely.
And the other thing is that these are different components
which ping back to the server.
You know, you already have two different components
that ping back in two different ways, essentially,
and then also two different code paths,
the legacy code path and a new one.
And it starts getting really confusing under which conditions
which code path will be hit and ping back, and managing all the versions.
You know, the complexity here is really that these are three components
talking to each other in certain ways, and managing
in which order you can deploy them so they don't break, and that sort of stuff.
It's always a bit fiddly to get that right when you deploy,
you know, so you don't hit errors because stuff isn't on the right version and that sort of thing.
One thing that struck me when I was thinking about this recently is that there
was a point in year one of this project where deploying this app somewhere else
would have been a case of checking out the repository, creating a Postgres database,
and running it. Then, as we added the build system, that became more and more
difficult to do, and now we've got bits executing out on Lambda as well.
And it has left behind any hope, I think, of anybody else
being able to stand up an instance of this app now. It's grown too many arms and legs
to be easily deployable by anyone else, I think.
Yeah. I mean, you still can, you know, and you get quite far by just running the server
and having the website and the database so you can do what most people actually see.
But you're right, the dynamic part isn't really possible to replicate.
You can still do a large part of the dynamic updating by running the ingestion and the analysis and,
you know, the fetching of data from GitHub and that sort of stuff.
That is still possible standalone, but the build system really isn't,
or has never really been part of that data cycle because...
Yeah, too many dependencies on infrastructure, really.
Yeah, yeah, absolutely.
Well, I guess it's a bit like Google Search, you can't run that yourself.
I'm not sure how to put it, but I guess it's just stuff you can't run.
No, although, do you remember, I remember a time when Google sold a search appliance,
which was, we had one.
So the Google Search appliance was a 1U rack-mountable server.
It was bright yellow.
And you bought it from Google and you mounted it in your rack on your intranet, and it would
run the Google crawler through your intranet documents and then present you with a Google
search interface for your content as well as the intranet's content.
Right, so for intranet purposes or just so you have a local searchable copy index of
the intranet?
It was for searching private documents, for searching internal
company documents. But they didn't ship it for very long. I would
imagine they only shipped it for a few years, two or three years I think.
It was not a business that they really got into. But I do remember it
and we did have one. I don't know where it is now; it's probably been
recycled into something else by now, but we did have one of these bright
yellow Google Search Appliances, that's
what they were called. That's crazy, I
never heard of this. That's really
interesting. I'll see if I can find a
link for the show notes. Bring your own
Google. Right, do we have any
other news? Don't think we have any other
news, no. I think maybe it's time for
package recommendations. Let's do the
packages. I think I normally start us off,
so why don't you start us this week, Sven?
All right, I'll go for it. So the first package I want to mention is called Sextant by Rocco Bowling.
And it's an interesting package. What's the description? High performance JSONPath queries for Swift.
So, JSONPath queries. I didn't actually know that's what they're called; I sort of knew the syntax,
because these are the things you can spell out if you use the jq command line tool.
If you're not familiar, jq is something you can install with Homebrew.
It's a little command line tool and you can pipe JSON into it or call it and then point it at a JSON file and it'll load that up.
You can use it just for display purposes, I think it does syntax highlighting or coloring,
but you can also query the structure.
So you can, if it's an array element,
you can pick out a certain number or range of elements.
If it's an object, you can drill into it,
you can pick out attributes,
or if they're nested structures,
you can go into the key path sort of.
And these expressions that you can write
to do these queries, these are called JSON paths.
And this library apparently adopts that specified,
I think it's like a specification that is portable.
Certainly it sounds like it.
I didn't actually dig into it that deeply,
but from the examples that I've seen,
you can reuse these common things to India Swift code then
and explore JSON objects, you know,
and drill into them in quite a concise and easy fashion.
I think on the whole, this is a bit like Regex for strings,
you know, handle with care,
because with the succinctness of the syntax also comes
you know, the burden of maintaining it because this gets quite hard to parse quickly.
These expressions can be quite gnarly, but if you have one that you know works...
it's a bit like regex, right?
If you have a regex that you know works, that's a great way to get started with regex in Swift, right?
Because you can just plug it in and let it do your filtering.
And the same here, if you have one that works, you can bring it in and then use it with this library to extract data.
I think one thing we could probably say about this is that it's no worse than a RegEx.
Yeah, and it seems to be really fast. That's one of its pitches, and it actually has a performance
comparison between this implementation and a few others, and it looks like this is
a lot faster than most of them. So this seems to be a really nice tool.
If you have a need for JSONPath queries, check this out: Sextant by Rocco Bowling.
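For a flavour of what a JSONPath expression buys you, here is a Foundation-only sketch (not Sextant's actual API, just an illustration): the single path "$.store.books[*].title" expresses the same selection as all the manual casting below.

```swift
import Foundation

// The JSONPath expression "$.store.books[*].title" selects every book title in one
// query. The Foundation-only equivalent of that selection looks like this:
let json = """
{ "store": { "books": [ { "title": "Relax" }, { "title": "Sextant" } ] } }
""".data(using: .utf8)!

let root = (try? JSONSerialization.jsonObject(with: json)) as? [String: Any]
let store = root?["store"] as? [String: Any]
let books = store?["books"] as? [[String: Any]]
let titles = books?.compactMap { $0["title"] as? String } ?? []
print(titles)  // ["Relax", "Sextant"]
```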
- That's great, and I also am a big fan
of the jq tool that you mentioned.
And just a little tip with that jq tool,
if you have it installed:
one of the most useful things it can do
is if you're curling some JSON from somewhere
to display it on your screen,
if you just pipe it through JQ with no parameters,
it just pretty prints it for you.
And that, it's worth everything.
It's worth having on your machine
if you only ever use it for that.
- Yeah, yeah, that's great.
While we're on the topic of JSON command line tools,
there's another one I started using recently.
It's called fx, and what it does...
I think it also has queries,
but what it has most of all is that you can pipe stuff in
and then it becomes navigable.
So you can fold and unfold structures and drill into them,
which is quite nice because if you have a large JSON,
it can be really hard to find your way around inside it.
And that has keyboard navigation
to explore the file quite nicely.
So that's another quick tip.
- We should start calling this podcast Unix indexing,
Unix tool indexing.
Okay, my first, well actually my first package
is a pair of packages.
I think it was the last episode that I recommended
a Markdown package that was compatible
with GitHub flavored Markdown.
Well, this week I've got two more Markdown packages.
The first one is MarkdownText from Shaps Benkau,
and it's similar in its kind of intention:
to render Markdown natively inside an iOS
or a macOS application.
And it uses SwiftUI for the rendering.
And what I really liked about this
is that you can customize it:
whenever it gets a piece of Markdown
and it's about to render it into a view,
you can take control of that view
and just add some view modifiers to it.
So for example, if you wanted custom unordered bullets,
you could just add a .foregroundColor
SwiftUI view modifier and make those bullets blue.
And I think that's quite a nice technique
using actually the power of SwiftUI view modifiers,
which are a really powerful way to customize
how something looks.
And so I quite like that.
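That "it's all just SwiftUI views" point is what makes the customization cheap: because each rendered element is an ordinary view, restyling it is a matter of applying standard view modifiers. A self-contained sketch of the idea follows, not the package's actual API.

```swift
import SwiftUI

// Self-contained illustration of the technique described above, not MarkdownText's
// own API: if the renderer hands you ordinary SwiftUI views for each Markdown element,
// plain view modifiers are enough to restyle them, e.g. blue unordered-list bullets.
struct BulletRow: View {
    let text: String
    var body: some View {
        HStack(alignment: .firstTextBaseline, spacing: 8) {
            Circle()
                .frame(width: 6, height: 6)
                .foregroundColor(.blue)   // the "make those bullets blue" customization
            Text(text)
        }
    }
}

struct BulletListExample: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            BulletRow(text: "Rendered from a Markdown list item")
            BulletRow(text: "Styled with a standard .foregroundColor modifier")
        }
        .padding()
    }
}
```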
And then another one that also had a release recently,
and both of these have had releases recently,
that's how I found them, through the RSS feeds as normal.
MarkdownText was last released five days ago;
it's been in development for five months,
but it had a 1.1 release five days ago.
And the second package is MarkdownView.
So the first one was MarkdownText,
the second one is MarkdownView,
and that is by Li Yanan.
That also had a release three days ago, a 1.0 release,
and it's been in development for a similar amount of time, six months.
So the unique thing about MarkdownView
is that it supports SVGs in the Markdown.
I've never seen that in a Markdown parser before.
- Oh, nice.
- Yeah, so this one is fully compliant
with the CommonMark standard, so not GitHub Flavored Markdown.
And MarkdownText, the previous one,
isn't GitHub Flavored Markdown either,
but this one does support SVG rendering.
So it feels like between the three of these packages,
we've got really great Markdown support.
The problem is they are three completely different packages,
which is always a shame, because writing any Markdown viewer
is not going to be trivial,
and you may hit that situation where you need one feature
from each of these different packages,
whether it's GitHub Flavored Markdown,
whether it's easy customization with SwiftUI views,
or whether it's SVG support.
But I thought all three were interesting, and they solve a good problem in, you know, potentially a lot of apps.
Well, here's to the person who's going to use all three of them, with the GitHub Flavored Markdown one for the tables and the other one for colored bullet points, as a patchwork of Markdown views.
You say that, but for quite a long time in the Swift Package Index project, we did have two different Markdown parsers,
one that supported GitHub Flavored Markdown and one that didn't.
And then we eventually just let GitHub render the READMEs
and got rid of our GitHub flavored one.
- Nice, it's great to see all these
Markdown renders coming up.
- Long live Markdown.
If I could write everything in Markdown, I would.
- Yeah.
My second package is called DiscordBM by Mahdi Bahrami.
And, as the name implies, it's about Discord.
It's a package to create Discord bots,
or to post on Discord with bots, I suppose,
going a bit beyond that in the sense that
you can also create slash commands,
which then execute things in the tool you're writing.
The reason I kept an eye on this is,
A, there's been quite a number of beta releases recently, leading up to a 1.0 release.
But also we are probably going to switch our monitoring reporting over to Discord.
So what we're doing right now: we've hooked up all our reporting, whenever there's an alert
or something going on, or a deployment, we're posting those to Telegram right now,
which is a channel that we can both check and where we see what's going on.
We're going to switch that over to our own Discord.
And that then allows us to see it there
and just not have Telegram running all the time,
but it'll also allow people in our dedicated Discord
to actually see those channels
and have a quick way of checking
if there's potentially something wrong,
because often when an alert fires,
I might post something there to explain,
I'm looking into it, or I think this is the reason,
and that sort of stuff.
So people can actually have more use of our Discord
by being able to see what's going on.
And obviously also our deployments are posted there.
So that's the reason I looked into it
and I found this package and it looks really nice.
I'm kind of looking for an excuse to also maybe build a bot.
I'm not sure if there's anything we could potentially do.
Maybe you could add a package
by having a command in our Discord.
That might be nice.
Put in a URL and then it goes off
and adds a pull request, something like that.
There might be some ideas there.
- There we go, yep.
- To play with it.
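For the simpler reporting half of that, posting an alert or a deployment notice into a channel, a plain incoming webhook is often enough, while DiscordBM covers the full bot and slash-command surface. A minimal Foundation sketch with a placeholder webhook URL (this is not DiscordBM's API):

```swift
import Foundation

// Minimal sketch: post a monitoring or deployment message to a Discord channel via an
// incoming webhook. This is not DiscordBM's API (that package covers the full bot and
// gateway surface); it's just the simplest possible "report to a channel" call.
// The webhook URL is a placeholder: create one in the channel settings and paste it in.
func postToDiscord(_ message: String) async throws {
    let webhook = URL(string: "https://discord.com/api/webhooks/ID/TOKEN")!

    var request = URLRequest(url: webhook)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: ["content": message])

    let (_, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, (200..<300).contains(http.statusCode) else {
        throw URLError(.badServerResponse)
    }
}

// Usage sketch, e.g. from an alert handler or deployment script:
// try await postToDiscord("Deployment finished on the production server")
```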
- It's probably worth mentioning our Discord
for people who are not familiar with it.
So our Discord is public, and the link to our Discord
is an open invite in the project's readme file.
So that's swiftpackageindex/swiftpackageindex-server
on GitHub, and if you read down the readme
or just search the readme for the word Discord,
you'll find a link to our Discord.
And if you do pop in, say hi and let us know
that you found it through the podcast
because that'll be nice to know.
But yes, it's an open Discord.
If you're interested in contributing to the project
or if you just want to chat about the project,
or, I mean, I'm sure this will never happen,
but if there's any kind of problem with the website,
you can also talk to us there about it.
- Absolutely, come and stop by.
- So my next package is from Apple, actually,
and it's Stable Diffusion.
You may have heard of Stable Diffusion before.
It's one of these AI image generation models, or rather, a tool that exercises
one of those image generation models.
And it's interesting because Apple have put some effort in; they obviously
haven't written Stable Diffusion, but they've put some effort into making
Stable Diffusion run well on Apple Silicon using Core ML.
And I just think this is quite an interesting thing for them to get involved with.
So I have lots of mixed feelings about all of these AIs.
If you've been reading iOS Dev Weekly for any amount of time, you've probably read some of my mixed feelings about all these AIs.
And I definitely do have questions about how we are potentially taking millions, if not billions,
of hours of human effort and using them to make some tech companies richer,
which I don't feel great about, to be honest. But I do actually find especially the image generation
fascinating. I've been playing with one of them, called Midjourney, for
a little while now and I've been having a great time with that. So there are two
parts to this Stable Diffusion repository that Apple have created. There
is a Python package which will convert the models to Core ML format.
And then there is a Swift package that developers can add to their
Xcode project to deploy image generation capabilities inside
one of their apps, and of course that depends on the Core ML models that come
along with it.
Yeah.
So I just think this is, this is interesting for a few reasons and it's
not something I would have expected Apple to get involved with.
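On the "run well on Apple Silicon using Core ML" part, the relevant knob when loading one of the converted models is the compute-units setting, which lets Core ML schedule work onto the GPU and the Neural Engine. A small sketch using plain Core ML API only; the model path is a placeholder for one of the compiled models the repository's Python conversion step produces, and this is not the pipeline API itself.

```swift
import CoreML
import Foundation

// The Core ML side of "runs well on Apple Silicon": choose the compute units when
// loading a converted model, and Core ML dispatches work to the GPU and Neural Engine.
// This is not the ml-stable-diffusion pipeline API itself, just the model-loading step;
// the path is a placeholder for one of the compiled models the Python conversion produces.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine   // or .all to let Core ML use the GPU as well

do {
    let modelURL = URL(fileURLWithPath: "/path/to/TextEncoder.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print(model.modelDescription)
} catch {
    print("Failed to load the model:", error)
}
```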
Yeah, it's really interesting to see. I saw that fly by as well.
And I think it's used in some of the apps that popped up when Stable Diffusion came out, right?
There's a couple of apps on the App Store, I think even on the iPad, at least one of them that I've seen.
Yes. And this package has only been around for two months, so it's really quite new.
But of course, Stable Diffusion was already available as an open source project,
so I'm sure it was possible to run it without these Core ML optimizations.
But anything that can make it faster, anything that can make it take advantage of Apple Silicon, is better.
Yeah. Yeah, I mean, it has dedicated chips, right, for that sort of stuff.
So I would imagine that should probably give it quite a good boost.
It'd be interesting to hear what the performance difference is.
I haven't seen any benchmarks or anything, but it must be considerable.
There are actually some benchmarks in the README,
but they are not benchmarks against non-optimized versions.
They are just benchmarks across different versions of Stable Diffusion
and different Apple Silicon iPads and laptops.
Right, right, okay.
So they're benchmarks, but not in the sense that maybe you were looking for.
Yeah, interesting.
Right, so my third pick is actually a bunch of picks.
And I'm picking the category of Mastodon client libraries.
And this was sort of triggered by a release that I've seen this week,
which is called TootSDK by Konstantin Kostov and David Wood.
And this is, as the name implies, a client
library where you can interface with Mastodon.
Mastodon, obviously, being the sort of new-ish, or at least recently
quite popular, Twitter replacement or
social microblogging network.
And this is an SDK which allows you to log in, you know, get your login
token and then pull down the timeline, make posts, and all the sorts of stuff that the
Mastodon API offers.
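Under the hood, an SDK like this wraps Mastodon's plain REST API. As a rough sketch of what the "pull down the timeline" part boils down to (raw API calls with Foundation, placeholder instance URL and token; this is not TootSDK's own interface):

```swift
import Foundation

// What an SDK like TootSDK wraps for you: Mastodon's REST API. Fetching the home
// timeline is one authenticated GET. The instance URL and token below are placeholders
// (the token comes from the OAuth login flow, which the SDK also handles), and this
// Status struct only decodes a couple of the many fields the real API returns.
struct Status: Decodable {
    let id: String
    let content: String   // the post body, as HTML
}

func homeTimeline(instance: String, accessToken: String) async throws -> [Status] {
    var request = URLRequest(url: URL(string: "\(instance)/api/v1/timelines/home")!)
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode([Status].self, from: data)
}

// Usage sketch:
// let posts = try await homeTimeline(instance: "https://mastodon.social", accessToken: "TOKEN")
```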
But I didn't want to mention just this one because there have actually been quite a number
of other client libraries that I've seen.
And people asked just recently; I saw someone ask what Mastodon libraries there
are, and I answered with a link to our search page, searching for Mastodon.
So there are currently four packages all in all that come up for Mastodon,
TootSDK being one and the most recent entry. And then there are two other packages that have been around
for around five years. One is called MastodonKit, which is by ornithocoder, so I'm not sure who
that person is, but it's mainly one person driving that one. The other one is called
Mastodon.swift, which is by Markus Kida and Thomas Bonk. That's also been in development
for five years and the third one is by Beylay, Mastodon API. That's been out for eight months,
but I think it has a longer history because Beylay is the author of Mastood, which is quite a popular
Mastodon client, the one I used for quite a while initially, very feature-rich client,
it's been around for a long time. So this isn't just a client that's popped up this fall,
it's been around for way longer.
Yeah, and all of these seem to be really quite mature
and good client libraries for Mastodon.
So if you're looking to write a Mastodon client,
which seems to be the JSON parser of 2022 and 2023,
this is probably a good place to start,
unless you also want to do that bit yourself
and go the whole way while implementing your client.
- I did get an email about this
from somebody who was recommending that I check it out in case it was worth a link in iOS Dev Weekly.
I have to admit, I did think for a second, I'm not sure we need any more Mastodon clients.
There can't be too many. Just bring them on. I think it's really exciting. It's great. And
what's really interesting is how different they actually are, you know, the little features
they each have. For quite a while now, I've been using a mix.
So I'm using Mona on the Mac, I'm using Ivory on iOS.
I use Masterwood when I need to edit
because you know, most of the others don't support that.
And they all have certain things that they do,
either they're the only ones that do it
or they do it better.
And it's quite interesting,
quite a rich environment there.
It's fun times.
- I think the question we need to ask ourselves
is on the package index,
Are there more Markdown view creators or Mastodon SDK packages?
It's a race.
At least in the era before JSON parsing was built into the standard libraries,
the JSON parser library was the library to write.
Yeah, that was what my reference was about.
Imagine Mastodon supported Markdown rendering.
Wouldn't that be awesome?
And you could throw that in the mix.
(laughing)
- Why doesn't it support Markdown?
That's a great question.
Anyway, before we even get into that topic,
I think we should wrap it up for this week.
So thank you so much for listening.
And like I said, when we were talking about the Discord,
please do come and join us in our Discord.
We are more than happy to have people
who are interested in the project come in there.
And yeah, we'll be back in a couple of weeks.
Yeah, stop by in our Discord and see you in two weeks. Bye bye.
Bye bye.