Tag: strings

  • macOS 15.2 Sequoia text strings reference possible improved camera that could offer Center Stage support for 2025 M4 MacBook Air

    Apple’s rumored M4 MacBook Air, said to arrive in 2025, could feature an upgraded Ultra Wide camera offering improved Center Stage support.

    The upgrade was referenced in macOS 15.2 Sequoia text strings, confirmed by AppleInsider, and was noted for both the 13-inch and 15-inch models.

    Beyond the leak in Wednesday’s macOS release, regulatory documents found by AppleInsider also reference a “Front Ultra Wide Camera.” This likely means the 1080p FaceTime HD camera in the M3 model is due for an upgrade.

    The Center Stage feature allows an iPad or suitably equipped Mac to use its Ultra Wide camera to follow you as you move around, keeping you centered in the frame. The feature was first introduced on the iPad Pro in 2021, when the M1 chip came to Apple’s tablet line.
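
    For developers, Center Stage is surfaced through AVFoundation. As a rough illustration, not something drawn from the leaked strings themselves, here is a hedged Swift sketch of how an app can opt in on hardware that supports the feature; the function name is hypothetical, but the AVCaptureDevice symbols are real API (iOS 14.5 and later, macOS 12.3 and later):

        import AVFoundation

        // Sketch: opting in to Center Stage on supported hardware.
        // The AVCaptureDevice symbols are real; the function is illustrative.
        func enableCenterStageIfSupported(on device: AVCaptureDevice) {
            // Center Stage only works with capture formats that support it.
            guard device.activeFormat.isCenterStageSupported else { return }

            // .app hands control of the toggle to the app; .cooperative
            // would share control with the user's own camera settings.
            AVCaptureDevice.centerStageControlMode = .app
            AVCaptureDevice.isCenterStageEnabled = true
        }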

    On the Mac end, Center Stage used to be part of Control Center, though it now belongs to a new menu bar icon that groups together options such as Portrait mode and Reactions. This icon changes to show when your camera is in use, or just your microphone, and it’s this icon that can easily be overlooked.

    The Center Stage feature can be used on current MacBook Air models, with a paired iPhone serving as the camera. To date, the feature requires either a paired iPhone or a suitable built-in camera, which is found only on the MacBook Pro. The addition of a wide-angle camera could also enable Desk View, which uses the camera to capture and then perspective-correct the area directly in front of your computer for display on screen; the result can be shared on a call or recorded. Desk View, too, is a Continuity Camera feature that leverages the wide-angle lens.
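
    Desk View likewise appears to developers as an ordinary capture device. As a hedged Swift sketch, assuming macOS 13 or later with Continuity Camera available, an app could look for it like this; the device type is a real AVFoundation symbol, while the surrounding code is illustrative only:

        import AVFoundation

        // Sketch: discovering the virtual Desk View camera that Continuity
        // Camera exposes on macOS 13+.
        let discovery = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.deskViewCamera],
            mediaType: .video,
            position: .unspecified)

        if let deskCam = discovery.devices.first {
            print("Found Desk View camera: \(deskCam.localizedName)")
        } else {
            print("No Desk View camera available on this system")
        }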

    Stay tuned for additional details as they become available.

    Via AppleInsider

  • Code strings found in the visionOS beta hint at forthcoming video mirroring, and other new features for the Apple Vision Pro headset

    Once again, it’s the code strings found in beta versions of a developer kit that reveal the interesting stuff.

    Apple’s forthcoming Vision Pro headset will apparently support screen mirroring via AirPlay or FaceTime, as indicated by code strings found in the beta 4 release of visionOS 1.0.

    The code strings are as follows:

    Select a device to mirror content to from your Apple Vision Pro

    Only one activity is available when mirroring or sharing your view through AirPlay or FaceTime.

    This suggests that users will be able to mirror the headset’s display to an external monitor or TV, or share their view with others through AirPlay or FaceTime. Other headsets, such as the Meta Quest, offer similar features, which can help reduce the wearer’s feeling of isolation from the people around them.
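
    For context, user-facing alerts like these live in the operating system’s localization tables, which is why they tend to surface in betas before a feature ships. As a hypothetical Swift-side sketch (the key name below is invented for illustration; NSLocalizedString itself is real Foundation API):

        import Foundation

        // In a .strings table, an entry like the leaked text would look like:
        //   "MIRRORING_PICKER_TITLE" = "Select a device to mirror content
        //    to from your Apple Vision Pro";
        // The key name is hypothetical; system code would resolve it with:
        let title = NSLocalizedString("MIRRORING_PICKER_TITLE",
                                      comment: "AirPlay mirroring device picker")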

    The visionOS 1.0 beta dev kit also references a new option to reset EyeSight data. EyeSight is the feature that displays a user’s simulated eyes on the external display of the Apple Vision Pro.

    The option reads as follows:

    You can reset EyeSight by going to Settings > People Awareness and tapping Reset Personalized EyeSight. This will remove personalized eye details from EyeSight, like your eye shape and measurements, but EyeSight will still use your skin tone where available. After you have reset EyeSight, you can restore it by recapturing your Persona.

    A “Persona” reference in the beta indicates that a user’s Persona will be sent to all participants on a FaceTime call so that everyone can view it. Apple Vision Pro generates Personas via machine learning, letting users share virtual representations of themselves that reflect their face and hand movements in real time with others over FaceTime.

    On FaceTime calls from a visionOS device, your Persona data will be sent securely to all users on the call, who can then view it. Once the call is completed, the Persona may remain stored, encrypted, on the other participants’ devices for up to 30 days, but those participants will be able to access your Persona only when they are on a call with you.

    The code references also indicate that Personas, but not the data used to generate them, could be stored on Apple’s servers, encrypted in a manner inaccessible to Apple. The relevant string reads as follows:

    To create your Persona and personalized EyeSight, Apple Vision Pro cameras capture images and 3D measurements of your face, head, upper body, and facial expressions. The data used to build your Persona and EyeSight do not leave your device. Your Persona may be stored on Apple servers, encrypted in a way that Apple cannot access.

    Other new alerts referenced in the beta include the following:

    • Calling unavailable while in Travel Mode
    • Brighten your lighting to use your Persona.
    • This video has excess motion, and could cause discomfort if expanded.

    Apple has stated that the Apple Vision Pro headset will launch in the U.S. in early 2024 before expanding to other countries at later dates.

    Stay tuned for additional details as they become available.

    Via MacRumors

  • Developer confirms 4GB of RAM in upcoming third-generation iPhone SE

    With word that the third-generation iPhone SE is en route, assorted writers and developers are beginning to dive into the upcoming handset’s technical specifications.

    Among key features such as the A15 processor, 5G support, longer battery life, an improved camera, and more durable glass, developer Moritz Sternemann has confirmed that the third-generation iPhone SE will feature 4GB of RAM, up from the 3GB found in the previous model. This information is sourced from strings within the Xcode 13.3 Release Candidate that Apple released following its “Peek Performance” event on Tuesday.

    Photo and video editing apps can benefit from access to more RAM, as they are able to keep more layers in memory. Additional RAM can also allow more apps, and more Safari tabs, to remain active in the background.
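
    As a quick illustration of why the figure matters to developers, an app can read the installed RAM at runtime. A minimal Swift sketch using the real ProcessInfo API:

        import Foundation

        // Sketch: reading the device's physical RAM at runtime.
        // physicalMemory is a real ProcessInfo property, reported in bytes.
        let bytes = ProcessInfo.processInfo.physicalMemory
        let gigabytes = Double(bytes) / 1_073_741_824  // bytes -> GiB
        print(String(format: "Installed RAM: %.1f GB", gigabytes))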

    The updated iPhone SE will be available to pre-order starting at 5 a.m. Pacific Time on Friday, March 11 in the United States, Australia, Canada, China, France, Germany, India, Japan, the United Kingdom, and more than 30 other countries and regions. Deliveries to customers and in-store availability will begin Friday, March 18.

    Stay tuned for additional details as they become available.

    Via MacRumors and @strnmn

  • iOS 13.1 code references hint at Smart Battery Case for 2019 model iPhones

    Once again, a group of code strings may have hinted at an upcoming product.

    Not surprisingly, a series of code strings referencing a possible Smart Battery Case for the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max has been discovered within the beta code for iOS 13.1. The model numbers found in the beta’s code are A2180, A2183, and A2184.

    While the strings don’t explicitly say which devices the numbers correspond to, the model numbers line up neatly with the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max, though which number maps to which iPhone variant is still a mystery at this time.

    It’s unknown when Apple will introduce the next wave of Smart Battery Case models. Although the company released the previous versions, for the iPhone XS, XS Max, and iPhone XR, in January of this year, it could be quite a while before the new cases are available to purchase.

    Stay tuned for additional details as they become available.

  • iOS 13 beta code strings offer direct hints at Apple’s Augmented Reality glasses project

    With Apple’s media event announced for September 10th, the company may have its augmented reality (AR) glasses coming down the pipeline sooner than expected. A discovery in an internal iOS 13 development build suggests Apple hasn’t stopped developing the rumored glasses, and it reveals some details about the unreleased device.

    The AR glasses are not expected to be unveiled until 2020 at the earliest.

    Recent iOS 13 betas have gained new AR tricks, which suggests Apple is laying the foundation of what will eventually become the operating system for the AR device. That software is thought to be based on iOS, and the glasses are expected to function as a companion device to an iPhone, requiring a connection to an iPhone to operate, much as the Apple Watch currently does.

    iOS beta testers noted a “STARTester” app that can switch in and out of a head-mounted mode, “presumably to replicate the functionality of an augmented reality headset on an iPhone for testing purposes.” The app includes two modes, “Worn” and “Held,” which lends credence to the idea that this is a test function for the AR glasses.

    An internal “README” file in iOS 13 notes a “Starboard” shell system for stereo AR apps, which could indicate the use of a headset. Other code strings, with names like “Views,” “Scenes,” “ARStarboardViewController,” and “ARStarboardManager,” reference AR functionality even more directly.

    Stay tuned for additional details as they become available.

    Via Boy Genius Report, MacRumors, and iDrop News