
@MapaX
Created April 26, 2021 10:45
Shell script to create xcframeworks from MLKit frameworks
#!/bin/zsh
makeXCFramework () {
  BASEDIR=$(pwd)
  echo "Script location: ${BASEDIR}"
  LIBNAME=$(basename "$BASEDIR")
  echo "lib is: $LIBNAME"
  cd Frameworks || return 1
  mkdir -p iphoneos
  mkdir -p iphonesimulator
  # Copy the fat framework into the platform-specific directories
  cp -R "$LIBNAME.framework/" "iphoneos/$LIBNAME.framework"
  cp -R "$LIBNAME.framework/" "iphonesimulator/$LIBNAME.framework"
  # Thin each copy: drop the simulator slice from the device copy and vice versa
  xcrun lipo -remove x86_64 "./iphoneos/$LIBNAME.framework/$LIBNAME" -o "./iphoneos/$LIBNAME.framework/$LIBNAME"
  xcrun lipo -remove arm64 "./iphonesimulator/$LIBNAME.framework/$LIBNAME" -o "./iphonesimulator/$LIBNAME.framework/$LIBNAME"
  # Bundle the two thinned frameworks into a single xcframework
  xcodebuild -create-xcframework -framework "iphoneos/$LIBNAME.framework/" -framework "iphonesimulator/$LIBNAME.framework/" -output "../../$LIBNAME.xcframework"
  cd ../..
}
cd MLKitCommon
makeXCFramework
cd MLKitVision
makeXCFramework
cd MLKitTextRecognition
makeXCFramework
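One way to sanity-check the result (a sketch, not part of the gist; `checkSlices` is a made-up helper name) is to list the platform slices inside each generated bundle:

```shell
# Sketch: after the three makeXCFramework calls, list what each
# generated .xcframework actually contains.
checkSlices () {
  xcfw=$1
  if [ ! -d "$xcfw" ]; then
    echo "missing: $xcfw"
    return 1
  fi
  # Each platform slice lives in its own subdirectory,
  # e.g. ios-arm64 and ios-x86_64-simulator.
  ls "$xcfw"
}

for name in MLKitCommon MLKitVision MLKitTextRecognition; do
  checkSlices "$name.xcframework" || true
done
```

If a bundle shows only one `ios-*` subdirectory, one of the two `lipo` thinning steps probably failed.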
@AngryDuckFTW

  1. Create a new folder.
  2. Create a Podfile inside the new folder with the following content:

platform :ios, '10.0'
use_frameworks!

plugin 'cocoapods-binary'

pod 'GoogleMLKit/FaceDetection', '0.64.0', :binary => true
pod 'GoogleMLKit/ImageLabeling', '0.64.0', :binary => true

  3. Open a terminal, change to the created folder, and run pod install.
  4. Copy makeXCFrameworks.sh into the Pods folder.
  5. Change to the Pods folder in the terminal.
  6. Run ./makeXCFrameworks.sh (the current directory should be Pods).
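The steps above can be sketched as a dry-run shell session (the folder name `MLKitBuild` is a placeholder; swap the `run` helper for direct execution to run it for real):

```shell
# Dry-run sketch of the steps above. `run` only prints each command;
# replace it with `run () { "$@"; }` to actually execute them.
run () { printf '+ %s\n' "$*"; }

run mkdir -p MLKitBuild
run cd MLKitBuild
run pod install
run cp ../makeXCFrameworks.sh Pods/
run cd Pods
run ./makeXCFrameworks.sh
```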

After that I have all the XCFrameworks in _XCFrameworks, as in my screenshot above: ML*, Google*, nanopb, etc.
If you run into problems, enable the debug echo in the script; you may spot something that is wrong on your side.
If everything works, adapt the Podfile to your needs.

I've just tried that and it's complaining a lot less now, but after it finishes I still only have the ML frameworks; for every other one the terminal prints "error: binaries with multiple platforms are not supported"

Do you have the latest Xcode (12.5) and CocoaPods (1.10.1) installed?

I have never seen this kind of problem on my side.

I'm using Catalina 10.15.7 and Xcode 12.4. I'm updating to Big Sur now and then installing 12.5, so hopefully that will make it work; I'll check back in when it's all updated.

@MapaX
Author

MapaX commented May 5, 2021

I got the Swift Package Manager version of ML Vision working. (It contains only MLKitTextRecognition, not all the libs.)
Here is a gist with the Package.swift: https://gist.github.com/MapaX/8c9b47b1683ef188eaf30d8b2d9d03f1
Here is an image of the folder structure. It requires one dummy source file so that it bundles correctly.
[Screenshot: package folder structure, 2021-05-04]
The MLKit .frameworks do not contain Info.plist files, so those need to be generated manually under the .framework directories before installing to a device works correctly. (Otherwise the app complains on startup that Info.plist is missing.)
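A minimal sketch of generating such an Info.plist (not from the thread; `makeInfoPlist` is a made-up helper, and the `com.google.mlkit` bundle-identifier prefix is an assumption to adjust):

```shell
# Sketch: write a minimal Info.plist into a .framework directory
# that ships without one.
makeInfoPlist () {
  fwdir=$1
  name=$(basename "$fwdir" .framework)
  cat > "$fwdir/Info.plist" <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleIdentifier</key>
  <string>com.google.mlkit.${name}</string>
  <key>CFBundleName</key>
  <string>${name}</string>
  <key>CFBundleVersion</key>
  <string>1.0</string>
  <key>CFBundleShortVersionString</key>
  <string>1.0</string>
  <key>CFBundlePackageType</key>
  <string>FMWK</string>
</dict>
</plist>
EOF
}

# Demo on a throwaway directory:
mkdir -p /tmp/Demo.framework
makeInfoPlist /tmp/Demo.framework
echo "wrote /tmp/Demo.framework/Info.plist"
```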

I saw that Firebase uses some of the same dependencies, and its package file contains links to those packages.
Maybe those can be used instead of the precompiled frameworks.

https://github.com/firebase/firebase-ios-sdk/blob/master/Package.swift

E.g.

 .package(
      name: "SwiftProtobuf",
      url: "https://github.com/apple/swift-protobuf.git",
      "1.15.0" ..< "2.0.0"
    ),

I tried to use the protobuf from SwiftPM, but something in the libs is hardcoded: they require .framework versions of the dependencies, and if you use the package it comes in a different form.

That is also the reason why FBLPromises is bundled under a different name. So if you also use Firebase, there will be two versions of FBLPromises in the release build: one as FBLPromises.o and one as FBLPromises.framework.

I think this is something that needs to be fixed in GoogleUtilities.framework.

@nilsnilsnils

I created my own MLKit package.

  1. Read the Readme.md and create the package
  2. Compile for the simulator
  3. Compile for a device via Xcode
  4. Upload an archive to Apple for TestFlight. Some errors only occur during the upload; see the Readme.md
  5. Test your app using the TestFlight version

Readme & Tools

@Alesete

Alesete commented Dec 16, 2021

Hi! Thanks for this workaround, you saved my day!

I had to fix the script for the issue @AngryDuckFTW reported here: the only frameworks generated were the MLKit ones.

I found that the error "binaries with multiple platforms are not supported" occurred with Google's dependencies because they contain the armv7 architecture inside the frameworks (not only x86_64 and arm64). The simulator copy then fails: it ends up with two architectures (x86_64 & armv7) from two different platforms (device & simulator).

The fix in the script is quite easy; add the following line:

xcrun lipo -remove armv7 ./iphonesimulator/$LIBNAME.framework/$LIBNAME -o ./iphonesimulator/$LIBNAME.framework/$LIBNAME

below line 49.

I also had to change line 100 from this

if [[ $FrameworkBaseFolder == MLKit* ]]; then

to this

if [[ $FrameworkBaseFolder == ML* ]]; then

because I also have MLImage as a dependency.
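A defensive variant of the lipo step (a sketch, not from the thread; `stripArch` is a made-up helper) removes an architecture only when the binary actually contains it, which avoids lipo errors on already-thin frameworks:

```shell
# Sketch: strip an architecture from a framework binary only if present.
stripArch () {
  bin=$1
  arch=$2
  if [ ! -f "$bin" ]; then
    echo "missing: $bin"
    return 0
  fi
  if xcrun lipo -archs "$bin" | grep -qw "$arch"; then
    xcrun lipo -remove "$arch" "$bin" -o "$bin"
  else
    echo "no $arch slice in $bin, skipping"
  fi
}

stripArch "./iphonesimulator/MLKitCommon.framework/MLKitCommon" armv7
```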

Thanks again, and best regards

@Mcrich23

Mcrich23 commented Apr 5, 2022

How would I apply this to GoogleMLKit/Translate?

@MapaX
Author

MapaX commented Apr 5, 2022

I think the easy way is to clone the https://github.com/nilsnilsnils/MLKitFrameworkTools repo, tune the Podfile, and just follow the instructions there.

I have not checked what Translate contains, but the same approach should work just fine.

@huuchi207

Can anyone give me the xcframework files for GoogleMLKit/FaceDetection? I tried the script above, but there are no frameworks at all in the output folder :(
