The 2024 iPhone SE 4 May Already Be Canceled

As its Pro line rises in popularity, Apple may forget about its budget option.

If you could get the same performance as an iPhone 14 for roughly half the price, wouldn’t you think it was worth getting? According to analyst Ming-Chi Kuo, Apple may scrap the next version of its wallet-friendly iPhone SE, which was expected to arrive next year.

“The supply chain has received instructions from Apple indicating that the production and shipment plans for the 2024 iPhone SE 4 have been canceled rather than delayed,” Kuo wrote in a Medium post on Jan. 6.

If Kuo’s predictions are correct, only Apple knows the reasoning behind the cancellation. But research and media reports suggest there may be a very simple answer. More expensive iPhone models tend to be more popular, possibly leaving little incentive for Apple to continue pursuing cheaper phones like the iPhone SE.

Apple didn’t respond to CNET’s request for comment.

The third-gen iPhone SE, which debuted in March 2022, starts at $429 and is essentially an iPhone 8 stuffed with the guts of an iPhone 13. CNET’s Patrick Holland called it a “mind-blowing value” in his iPhone SE (2022) review, but he also highlighted missing elements that make it feel dated, like its lack of a full-screen design or night mode for the camera.

Previous generations of the iPhone SE took a similar approach: the 2020 version kept that same iPhone 8 design but ran on the iPhone 11’s chip, and the 2016 model had the body of an iPhone 5S with the processor of an iPhone 6S. A rumored fourth-gen iPhone SE was expected to adopt a new design similar to the iPhone XR, according to YouTube personality and gadget leaker Jon Prosser.

To understand why Apple may have canceled its next iPhone SE, consider the changes Apple recently made in its iPhone 14 lineup. In a departure from the prior two years, Apple did not release a cheaper “Mini” version of its new flagship iPhone in 2022. Instead, it released a larger version of the regular iPhone 14 called the iPhone 14 Plus, which is $100 more expensive than the regular $799 iPhone 14, raising the barrier to entry for shoppers.

This change followed reports from Nikkei Asian Review and Kuo (via MacRumors) that Apple would scrap the iPhone Mini in 2022, with the former adding that the Mini model didn’t resonate with consumers. Taken together, the elimination of the Mini and reports that next year’s rumored iPhone SE may have been axed could indicate that Apple is pivoting away from releasing smaller, cheaper iPhones.

But Gene Munster, managing partner for tech investment firm Loup and a longtime Apple analyst, doesn’t see it that way. He believes the iPhone SE still plays an important role in Apple’s lineup because of its low price.

“It’s still, I think, the best bang for your buck when it comes to an iPhone,” he said.

Evidence suggests that more expensive iPhones tend to be top sellers, which might help explain the shift. The average selling price of an iPhone rose 7% year over year in the third quarter of 2022, according to Counterpoint Research. The market research firm also reported in June that Apple dominated the global market for premium phones with a 62% share in the first quarter of 2022. It’s important to note that Counterpoint defines premium phones as devices that cost $400 or more, meaning the third-gen iPhone SE would fall into that category since it starts at $429. The report, however, did say that Apple’s growth in the premium category largely came from the iPhone 13 lineup.

The pricier iPhone 14 Pro and Pro Max also appear to be the stars of Apple’s new smartphone lineup so far. The Information reported that Apple cut production of the iPhone 14 Plus within two weeks of its launch, while research firm TrendForce says Apple boosted production of the iPhone 14 Pro and Pro Max. Kuo also said on Twitter in September that the iPhone 14 Pro Max was responsible for about 60% of Apple’s order increase for the Pro lineup, hinting that Apple’s most expensive new iPhone is also its most popular. And a November 2022 report from Wave7 Research said 38 out of 39 surveyed carrier service representatives had no in-store inventory of the iPhone 14 Pro or Pro Max.

The iPhone SE, meanwhile, has only accounted for roughly 5% to 8% of quarterly US iPhone sales over the past two years, said Josh Lowitz of Consumer Intelligence Research Partners. When looking at total volumes for the newest iPhone models in a given year, the Pro versions typically account for 35% to 40% of sales, while the iPhone SE makes up about 20%, according to Munster.

Carrier discounts have also made it easier to snag high-end phones at cheaper prices, especially when trading in an old device, as my colleague Eli Blumenthal pointed out following the iPhone 13’s launch. Many US shoppers also pay for their phones in monthly installment plans through carriers, which can make higher prices easier to swallow. Both of those factors can make the case for buying a less expensive iPhone with fewer features all the more challenging.

“Because you’re paying over such a long period of time … the $100 price difference, or even a $200 price difference, isn’t that much per month,” said Lowitz.

At the same time, Samsung has seen success in the market for lower-priced phones. Its Galaxy A phones, which typically cost hundreds of dollars less than its flagship Galaxy S phones, accounted for 58% of Samsung’s smartphone unit sales in 2021, according to Counterpoint Research data previously provided to CNET. Samsung’s Galaxy A phones have modern features not found on the iPhone SE, such as a camera with multiple lenses and larger screens, although they often run on less powerful processors than Samsung’s pricier phones.

The notion that Apple may have canceled the 2024 iPhone SE raises questions about the future of the SE line in general. Apple first extended the SE branding to the Apple Watch in 2020 and launched a sequel to that product in September 2022.

Bringing the SE line to another product made it seem like a more permanent fixture in Apple’s lineup. But if Kuo’s insight turns out to be accurate, the iPhone SE’s days may be numbered. And once you take a closer look at how Apple’s smartphone lineup has changed and the data around phone shipments, it’s easy to understand why.

But Munster isn’t convinced that cheaper iPhones like the SE are going away for good. Having a more affordable option makes it easier for Apple to achieve its broader goal of bringing more customers into its web of products and services.

“I think that plays an important role,” he said. “I don’t think Apple’s giving up on that price point.”

How Pixel Binning Makes Your Samsung, Apple and Google Photos Better

Flagship phones rely on this technology to offer good low-light performance when it’s dark and high-resolution photos when it’s bright. Here’s how it works.

Megapixels used to be so much simpler: A bigger number meant your camera could capture more photo detail as long as the scene had enough light. But a technology called pixel binning, now universal on flagship smartphones, is changing the old photography rules for the better. In short, pixel binning gives you a camera that offers lots of detail when it’s bright out, without becoming useless when it’s dim.

The necessary hardware changes bring some tradeoffs and interesting details, though, and different phone makers are trying different pixel binning recipes, which is why we’re taking a closer look.

Read more: Check out CNET’s Google Pixel 7 Pro review, iPhone 14 Pro review and Galaxy S22 Ultra review

Pixel binning arrived in 2018, spread widely in 2020 with models like Samsung’s Galaxy S20 Ultra and Xiaomi’s Mi 10 Pro, and reached Apple and Google hardware with the iPhone 14 Pro and Pixel 7 phones in 2022. The top-end model from Samsung, the Galaxy S22 Ultra, features a 108-megapixel main camera sensor, and pixel binning could take the next technological leap with the S23 Ultra’s expected 200-megapixel main camera, set to debut Feb. 1.

Here’s your guide to what’s going on.

What is pixel binning?
Pixel binning is a technology that’s designed to make an image sensor more adaptable to different conditions by grouping pixels in different ways. When it’s bright, you can shoot at the full resolution of the sensor, at least on some phones. When it’s dark, sets of pixels — 2×2, 3×3, or 4×4, depending on the sensor — can be grouped into larger virtual pixels that gather more light but take lower-resolution shots.

For example, Samsung’s Isocell HP2 sensor can take 200-megapixel shots, 50-megapixel shots with 2×2 pixel groups, and 12.5-megapixel shots with 4×4 pixel groups.
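The arithmetic behind those modes is simple: each N×N group of real pixels becomes one virtual pixel, so the megapixel count divides by N². A quick sketch in Python (the helper function is ours, purely for illustration):

```python
# Each NxN group of real pixels becomes one virtual pixel,
# so the effective megapixel count divides by N squared.
def binned_megapixels(full_mp: float, n: int) -> float:
    return full_mp / (n * n)

print(binned_megapixels(200, 2))  # 2x2 binning: 50.0 megapixels
print(binned_megapixels(200, 4))  # 4x4 binning: 12.5 megapixels
```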

Pixel binning offers another advantage that arrived in 2020 phones: virtual zoom. Phones such as the iPhone 14 Pro, with its 48-megapixel main camera, and the Google Pixel 7, with its 50-megapixel camera, can crop a shot to gather light from only the central pixels. That turns a 1x main camera into a 2x zoom that takes 12-megapixel photos. It only works well with relatively good light, but it’s a great option, and since 12 megapixels has been the prevailing resolution for years, the result is still a useful shot.
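To see why that crop works out to a 2x zoom at 12 megapixels, picture the sensor as a grid and keep only its center. A minimal NumPy sketch, using an 8,000×6,000 grid to stand in for a 48-megapixel sensor:

```python
import numpy as np

# A toy "sensor" of 8000 x 6000 photosites (48 megapixels). Keeping the
# central half of the width and height keeps a quarter of the pixels,
# which doubles the effective focal length: a 2x zoom at 12 megapixels.
sensor = np.zeros((6000, 8000))
h, w = sensor.shape
crop = sensor[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]

print(crop.shape)               # (3000, 4000): 12 megapixels
print(sensor.size / crop.size)  # 4.0: a quarter of the pixels
```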

With such a high base resolution, pixel binning sensors also can be more adept with high-resolution video, in particular at extremely high 8K resolution.

Pixel binning requires some fancy changes to the sensor itself and the image-processing algorithms that transform the sensor’s raw data into a photo or video.

Is pixel binning a gimmick?
No. Well, mostly no. It does let phone makers brag about megapixel numbers that vastly exceed what you’ll see even on professional-grade DSLR and mirrorless cameras. That’s a bit silly, since the larger pixels on high-end cameras gather vastly more light and feature better optics than smartphones. But few of us haul those big cameras around, and pixel binning can wring more photo quality out of your smartphone camera.

How does pixel binning work?
To understand pixel binning better, you have to know what a digital camera’s image sensor looks like. It’s a silicon chip with a grid of millions of pixels (technically called photosites) that capture the light that comes through the camera lens. Each pixel registers only one color: red, green or blue.

The colors are staggered in a special checkerboard arrangement called a Bayer pattern that lets a digital camera reconstruct all three color values for each pixel, a key step in generating that JPEG you want to share on Instagram.

Combining data from multiple small pixels on the image sensor into one larger virtual pixel is really useful for lower-light situations, where big pixels are better at keeping image noise at bay and at capturing color. When it’s brighter out, there’s enough light for the individual pixels to work on their own, offering the higher-resolution shot or a zoomed-in view.
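Here’s a rough illustration of that tradeoff: a minimal NumPy sketch of 2×2 binning on simulated dim-scene data. Real camera pipelines bin same-color pixels within the color mosaic and use smarter weighting; this only shows the resolution-for-light exchange:

```python
import numpy as np

# Simulate a dim scene: photon counts follow a Poisson distribution,
# so each tiny pixel's reading is noisy relative to its small signal.
rng = np.random.default_rng(0)
raw = rng.poisson(lam=4, size=(3000, 4000)).astype(np.float64)

# 2x2 binning: sum each 2x2 block of real pixels into one virtual pixel.
binned = raw.reshape(1500, 2, 2000, 2).sum(axis=(1, 3))

print(raw.shape, "->", binned.shape)  # (3000, 4000) -> (1500, 2000)
print(raw.mean(), binned.mean())      # each virtual pixel holds ~4x the signal
```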

Pixel binning commonly combines four real pixels into one virtual pixel “bin.” But Samsung’s Galaxy S Ultra line has combined a 3×3 group of real pixels into one virtual pixel, and the South Korean company is likely to adopt 4×4 binning with the Galaxy S23 Ultra.

When should you use high resolution vs. pixel binning?
Most people will be happy with lower-resolution shots, and that’s the default my colleagues Jessica Dolcourt and Patrick Holland recommend after testing the new Samsung Galaxy phones. Apple’s iPhones won’t even take 50-megapixel shots unless you specifically enable the option while shooting with its high-end ProRaw image format, and Google’s Pixel 7 Pro doesn’t offer full 50-megapixel photos at all.

The 12-megapixel shots offer better low-light performance, but they also avoid the monster file sizes of full-resolution images that can gobble up storage on your device and online services like Google Photos and iCloud. For example, a sample shot my colleague Lexy Savvides took was 3.6MB at 12 megapixels with pixel binning and 24MB at 108 megapixels without.

Photo enthusiasts are more likely to want to use full resolution when it’s feasible. That could help you identify distant birds or take more dramatic nature photos of distant subjects. And if you like to print large photos (yes, some people still make prints), more megapixels matter.

Does a 108-megapixel Samsung Galaxy S22 Ultra take better photos than a 61-megapixel Sony A7r V professional camera?
No. The size of each pixel on the image sensor also matters, along with other factors like lenses and image processing. There’s a reason the Sony A7r V costs $3,898 while the S22 Ultra costs $1,200 and can also run thousands of apps and make phone calls.

Image sensor pixels are squares whose width is measured in millionths of a meter, or microns. A human hair is about 75 microns across. On Samsung’s Isocell HP2, a virtual pixel on a 12-megapixel shot is 2.4 microns across. In 200-megapixel mode, a pixel measures just 0.6 microns. On a Sony A7r V, though, a pixel is 3.8 microns across. That means the Sony can gather about two and a half times more light per pixel than a phone using the HP2 in 12-megapixel binning mode, and about 40 times more than in 200-megapixel full-resolution mode — a major difference in image quality.
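The rule of thumb behind those comparisons: the light a pixel gathers scales roughly with its area, so the ratio goes as the square of the pixel width. Here’s that arithmetic, a simplification that ignores microlenses, quantum efficiency and other factors:

```python
# Light gathered per pixel scales roughly with pixel area (width squared).
def light_ratio(width_a_microns: float, width_b_microns: float) -> float:
    return (width_a_microns / width_b_microns) ** 2

print(light_ratio(3.8, 2.4))  # ~2.5x: Sony A7r V pixel vs. a binned HP2 pixel
print(light_ratio(3.8, 0.6))  # ~40x: vs. an HP2 pixel at full resolution
```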

Phones are advancing faster than traditional cameras, though, and closing the image quality gap. Computational photography technology like combining multiple frames into one shot and other software processing tricks made possible by powerful phone chips are helping, too. That’s why my colleague and professional photographer Andrew Lanxon can take low-light smartphone photos handheld that would take a tripod with his DSLR. And image sensors in smartphones are getting bigger and bigger to improve quality.

Why is pixel binning popular?
Because miniaturization has made ever-smaller pixels possible. “What has propelled binning is this new trend of submicron pixels,” those less than a micron wide, said Devang Patel, a senior marketing manager at Omnivision, a top image sensor manufacturer. Having lots of those pixels lets phone makers — desperate to make this year’s phone stand out — brag about lots of megapixel ratings and 8K video. Binning lets them make that boast without sacrificing low-light sensitivity.

Can you shoot raw with pixel binning?
That depends on the phone. Photo enthusiasts like the flexibility and image quality of raw photos — the unprocessed image sensor data, packaged as a DNG file. But not all phones expose the raw photo at full resolution. The iPhone 14 Pro does, but the Pixel 7 Pro does not, for example.

The situation is complicated by the fact that raw processing software like Adobe Lightroom expects raw images whose color data comes in a traditional Bayer pattern, not pixel cells grouped into 2×2 or 3×3 patches of the same color.
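To make the mismatch concrete, here’s a small sketch of the two layouts: a conventional Bayer patch, where colors alternate pixel by pixel, next to the quad-style layout binning sensors use, where each color fills a 2×2 cell:

```python
# Conventional Bayer mosaic: colors alternate every pixel.
bayer = [
    "G R G R",
    "B G B G",
    "G R G R",
    "B G B G",
]

# Quad layout used by 2x2 binning sensors: each color fills a 2x2 cell.
quad = [
    "G G R R",
    "G G R R",
    "B B G G",
    "B B G G",
]

print("\n".join(bayer), "\n")
print("\n".join(quad))
```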

The Isocell HP2 has a clever trick here, though: it uses AI technology to “remosaic” the 4×4 pixel groups to construct the traditional Bayer pattern color checkerboard. That means it can shoot raw photos at full 200-megapixel resolution, though it remains to be seen whether that will be an option exposed in shipping smartphones.

What are the downsides of pixel binning?
For the same size sensor, 12 real megapixels would perform a bit better than 12 binned megapixels, says Judd Heape, a senior director at Qualcomm, which makes chips for mobile phones. The sensor would likely be less expensive, too. And when you’re shooting at full resolution, more image processing is required, which shortens your battery life.

Indeed, pixel binning’s sensor costs and battery and processing horsepower requirements are reasons it’s an option mostly on higher-end phones.

For high-resolution photos, you’d get better sharpness with a regular Bayer pattern than with a binning sensor using 2×2 or 3×3 groups of same-color pixels. But that isn’t too bad a problem. “With our algorithm, we’re able to recover anywhere from 90% to 95% of the actual Bayer image quality,” Patel said. Comparing the two approaches in side-by-side images, you probably couldn’t tell a difference outside lab test scenes with difficult situations like fine lines.

If you leave your phone in high-resolution mode and then take shots in the dark, image quality suffers. Apple automatically uses pixel binning to take lower-resolution shots, sidestepping that risk.

Could regular cameras use pixel binning, too?
Yes, and judging by some full-frame sensor designs from Sony, the top image sensor maker right now, they may someday do just that.

What’s the future of pixel binning?
Several developments are possible. Very high-resolution sensors with 4×4 pixel binning could spread to more premium phones, and less exotic 2×2 pixel binning will spread to lower-end phones.

Another direction is better HDR, or high dynamic range, photography that captures a better span of bright and dark image data. Small phone sensors struggle to capture a broad dynamic range, which is why companies like Google and Apple combine multiple shots to computationally generate HDR photos.

But pixel binning means new pixel-level flexibility. In a 2×2 group, you could devote two pixels to regular exposure, one to a darker exposure to capture highlights like bright skies, and one to a brighter exposure to capture shadow details.

Indeed, Samsung’s HP2 can divvy up pixel duties this way for HDR imagery.
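Here’s a minimal sketch of how one split-exposure 2×2 group could be merged into a single HDR value. The exposure times, saturation level and merge rule are illustrative assumptions, not Samsung’s actual pipeline:

```python
import numpy as np

SATURATION = 1023  # full-well value for a hypothetical 10-bit pixel

def merge_hdr_group(values, exposures):
    """Merge one 2x2 group: two base-exposure pixels, one short, one long."""
    vals = np.asarray(values, dtype=np.float64)
    exp = np.asarray(exposures, dtype=np.float64)
    usable = vals < SATURATION       # discard clipped (saturated) readings
    radiance = vals / exp            # normalize to a common exposure scale
    return radiance[usable].mean()   # average the trustworthy samples

# Bright sky: the base and long pixels clip, but the short exposure keeps
# detail, so the merge recovers a value above the 10-bit ceiling.
print(merge_hdr_group([1023, 1023, 400, 1023], exposures=[1.0, 1.0, 0.25, 4.0]))
# 1600.0
```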

Omnivision also expects autofocus improvements. With earlier designs, each pixel is capped with its own microlens designed to gather more light. But now a single microlens sometimes spans a 2×2, 3×3, or 4×4 group, too. Each pixel under the same microlens gets a slightly different view of the scene, depending on its position, and the difference lets a digital camera calculate focus distance. That should help your camera keep the photo subjects in sharp focus.
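Under the hood, that calculation is a correlation problem: find the shift that best aligns the views from opposite sides of the microlens. A toy one-dimensional version with made-up data:

```python
import numpy as np

# Two views of the same edge from pixels under one microlens. When the
# subject is out of focus, the views are displaced relative to each other.
left = np.array([0, 1, 5, 9, 5, 1, 0, 0, 0], dtype=float)
right = np.array([0, 0, 0, 1, 5, 9, 5, 1, 0], dtype=float)

# Try candidate shifts and keep the one with the smallest mismatch.
shifts = list(range(-4, 5))
errors = [np.sum((np.roll(left, s) - right) ** 2) for s in shifts]
best = shifts[int(np.argmin(errors))]

print(best)  # 2: the views are two pixels apart, which maps to a focus move
```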

How to Take Those Really Long iPhone Screenshots

More than your average iOS screenshot.

We all know how to take a screenshot on the iPhone. You just press the volume up and side buttons at the same time and you capture exactly what’s on your screen — nothing more, nothing less. However, that type of screenshot may not be enough if you’re also trying to capture what’s above or below what you can see on the screen.

Hidden within iOS is a scrolling screenshot feature that allows you to snap multiple pages with only a single screenshot. There are third-party apps you can use to stitch together individual screenshots and create a longer one, but a scrolling screenshot makes the process easier.

Read more: 10 iOS 16 Hidden Features That Just Make Your iPhone Better

If you want to save a film script in Safari or a long PDF in your email, here’s what you need to know to take scrolling screenshots on your iPhone.

And if you’re interested in learning about other hidden iOS features, check out this sneaky way to secretly message someone else on iOS and the hidden trackpad that lives in your iOS keyboard.

What is a scrolling screenshot?
A full-page screenshot, or scrolling screenshot, captures an entire page — webpage, document or email — without you having to take multiple screenshots and then stitch them together. For example, if you wanted to screenshot a 116-page document in Safari, you would only have to take a single screenshot to capture the entire thing.

How to take a scrolling screenshot on your iPhone
To take a scrolling screenshot, do the following:

  1. First, take a regular screenshot on your iPhone. If you have Face ID, quickly press the side button + volume up button. With Touch ID, it’s side/top button + home button.
  2. Tap the screenshot preview that appears in the bottom-left corner. It appears for about five seconds, so you must be somewhat swift.
  3. Next, go to the Full Page option. Underneath Full Page, you’ll see a preview of the entire scrolling screenshot on the right side, along with a larger preview in the middle. You also have tools to crop the scrolling screenshot, in case it’s too long.
  4. Once you’re finished editing the scrolling screenshot, hit Done. You’ll see two options: one to save the scrolling screenshot and another to delete it.
  5. Finally, tap Save PDF to Files to save the scrolling screenshot.

You must choose a folder to save the scrolling screenshot in. By default, the Files app will select the last folder you saved something to or the Downloads folder.

How to view scrolling screenshots on your iPhone
All scrolling screenshots are converted to PDFs, so they’re saved to the native Files app. To view your scrolling screenshot, open the Files app, go to the folder in which your screenshot was saved and tap the screenshot.

Here you can rename the file, draw on it, leave comments and more. You can also share the scrolling screenshot, but the other person must have Files or another PDF reader to view it.

Best Apple Watch Apps: Don’t Bother With Third-Party Options

There are plenty of apps for the Apple Watch, but Apple’s native apps are still among the best.

The Apple Watch Series 8 is an iterative upgrade over the Series 7, and with each new iteration the Apple Watch gets more advanced, especially when it comes to tracking your health and fitness. If you want to take advantage of the best Apple Watch apps, we have some pretty straightforward advice: Skip the App Store and stick with the watch’s native apps.

Companies including Amazon, eBay, Target, Slack and TripAdvisor have dropped support for Apple Watch apps, but those services are better suited for our phones, tablets and laptops anyway. What does matter are the built-in Activity, Messages and Phone apps — the things we want on hand for a quick and convenient glance, regardless of which Apple Watch version we’re currently sporting.

“The watch is really about convenience,” said Ray Wang, principal analyst and founder of Constellation Research. “You’re not going to spend so much screen time on your watch. So I think the secret of building a good Apple Watch app is to think of it as an accessory in addition to something. Very few people use it as a standalone unless it’s for fitness or health or some kind of monitoring.”

Read more: Set Up Your New Apple Watch in Just a Few Taps

When the Apple Watch launched in 2015, it had 3,000 apps available to download. Today, there are more than 20,000 apps — 44 of which are built into the wearable. While watches weren’t an in-demand accessory in general back in 2015, the Apple Watch proved to be a useful tool for checking messages, the weather and reminders, Wang added — all of which are already built into the device.

Here are several native Apple Watch apps that you may not already be using.

  1. Sleep
    The Apple Watch was late to the game when it came to sleep tracking — a crucial wellness feature that rivals like Fitbit have offered for years. While Apple’s Sleep app may not be as comprehensive as the sleep monitoring available on other devices, it’s still a great way to keep track of your slumber and get into a regular bedtime routine. When you wear your Apple Watch overnight, it’ll tell you how much time you’ve spent asleep while in bed, as well as your sleeping respiratory rate. That latter feature is a new addition that Apple launched with WatchOS 8 in September.
  2. Wallet
    The Apple Watch is designed to make it so that you don’t have to reach for your phone as often, and the Wallet app is one of the best examples. It allows you to store things like credit cards, boarding passes and movie tickets on your wrist once you’ve added them to the Wallet app on your phone. That means you won’t have to dig into your purse or pocket to make a quick purchase or board your flight. Apple is also expanding what the Wallet app can do in WatchOS 8, which introduces the ability to add home keys and identification cards to your watch.
  3. Messages
    The Messages app is one of the most basic and fundamental Apple Watch apps, but it’s also among the most useful. As the name implies, Messages allows you to read and respond to text messages directly from your wrist. Your phone is still the best tool for sending long text messages, but the Apple Watch can come in handy for sending short, time-sensitive replies when you don’t have a moment to reach for your phone. If you have an Apple Watch Series 7 or later, you’ll be able to respond to texts using the device’s QWERTY keyboard, which is much easier than using the Scribble function.
  4. Noise
    If you have an Apple Watch Series 4 or later, you can use the Noise app to measure the ambient sound in your environment. If the decibel level has risen to a point where your hearing could be affected, the app can notify you with a tap on your wrist.

Read more: Apple Watch Series 7 Review: A Slightly Better Smartwatch Than Last Year’s

  5. Cycle Tracking
    You can use the Cycle Tracking app to log details about your menstrual cycle, including flow information and symptoms such as headaches or cramps. Using that data, the app can alert you when it predicts your next period or fertile window is about to start.
  6. ECG
    If you have an Apple Watch Series 4 or later, you have an electrical heart rate sensor that works with the ECG app to take an electrocardiogram (sometimes called an EKG by cardiologists). You’ll also need an iPhone 6S or later, and both the phone and the watch will need to be on the latest version of iOS and WatchOS, respectively. It’s also not available in all regions.
  7. News
    The News app will help you keep up with current events on the fly, showing you stories that it selects based on your interests. However, it’s not available in all areas.
  8. Mindfulness
    The Apple Watch has long offered breathing exercises. But WatchOS 8’s Mindfulness app, which replaced the Breathe app, adds a new option to the Apple Watch’s relaxation repertoire: reflections that prompt you to pause and think about special moments in your life. You’re still able to access Breathe sessions from this app, but the new Reflect option just gives you another way to take a break from your day.
  9. Remote
    If you have an Apple TV, you can use your watch as another remote control — assuming both devices are connected to the same Wi-Fi network. Use the Remote app to swipe around on the watch face and move through the Apple TV menu options, and play or pause shows.
  10. Camera
    You can’t take a picture with your watch itself. But with the Camera app, your watch can act as a remote control for your iPhone’s camera. Use it to help take selfies or start recording on your phone across the room, so you can finally get everyone in that big group shot.
  11. Walkie-Talkie
    The Walkie-Talkie app lets you use your watch as a walkie-talkie to chat with another person wearing an Apple Watch. You press a button to talk, and release it to listen to the reply. The app isn’t available in all regions, and both participants need to have connectivity through a Bluetooth connection to the iPhone, Wi-Fi or cellular. You also have to accept an invitation to connect with someone through the app — they can’t just start talking to you.
  12. Voice Memos
    Like on the iPhone, you can use the Voice Memos app on your Apple Watch to record personal notes and things to remember while on the go. The voice memos you record on the watch will automatically sync to any other iOS devices where you’re signed in with the same Apple ID.

The future of native Apple Watch apps
The collection of native Apple Watch apps is likely far from complete. We saw the addition of the Sleep app and the Blood Oxygen app with last year’s WatchOS 7 software update and Apple Watch Series 6, respectively. And if reports are to be believed, Apple has broader ambitions in the health and wellness space that we could see in the years to come. The company is reportedly working on blood pressure and thermometer tools for the Apple Watch, according to The Wall Street Journal. Apple is also working on a blood-sugar sensor that could help diabetics manage their glucose levels, Bloomberg reported last year, although that report says the functionality likely won’t be commercially available for several years.

AI as Lawyer: It’s Starting as a Stunt, but There’s a Real Need

People already have a hard enough time getting help from lawyers. Advocates say AI could change that.

Next month, AI will enter the courtroom, and the US legal system may never be the same.

An artificial intelligence chatbot, technology programmed to respond to questions and hold a conversation, is expected to advise two individuals fighting speeding tickets in courtrooms in undisclosed cities. The two will each wear a wireless earbud, which will relay what the judge says to the chatbot being run by DoNotPay, a company that typically helps people fight traffic tickets through the mail. The earbud will then play the chatbot’s suggested responses to the judge’s questions, which the individuals can then choose to repeat in court.
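Mechanically, what’s described is a relay loop: courtroom audio goes to speech-to-text, the transcript goes to a language model, and the model’s suggestion plays back through the earbud. DoNotPay hasn’t published its implementation, so the sketch below is purely illustrative; the transcribe, call_llm and speak helpers are hypothetical stand-ins for real speech and GPT-3-style APIs:

```python
# Purely illustrative sketch of the relay loop described above. All three
# helpers are hypothetical stand-ins, not DoNotPay's actual code.

def transcribe(audio_chunk: bytes) -> str:   # stand-in for a speech-to-text API
    return "How do you plead?"

def call_llm(prompt: str) -> str:            # stand-in for a GPT-3-style API call
    return "Not guilty, your honor. I'd like to contest the radar evidence."

def speak(text: str) -> None:                # stand-in for earbud audio playback
    print(f"[earbud] {text}")

def relay(audio_chunks):
    for chunk in audio_chunks:
        judge_said = transcribe(chunk)
        suggestion = call_llm(
            "You are helping a defendant contest a speeding ticket. "
            f'The judge just said: "{judge_said}" '
            "Suggest a brief, respectful reply."
        )
        speak(suggestion)

relay([b"..."])  # one simulated exchange
```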

It’s a stunt. But it also has the potential to change how people interact with the law, and to bring many more changes over time. DoNotPay CEO Josh Browder says expensive legal fees have historically kept people from hiring traditional lawyers to fight for them in traffic court, which typically involves fines that can reach into the hundreds of dollars.

So, his team wondered whether an AI chatbot, trained to understand and argue the law, could intervene.

“Most people can’t afford legal representation,” Browder said in an interview. Using the AI in a real court situation “will be a proof of concept for courts to allow technology in the courtroom.”

Regardless of whether Browder is successful — he says he will be — his company’s actions mark the first of what are likely to be many efforts to bring AI further into our daily lives.

Modern life is already filled with the technology. Some people wake up to a song chosen by AI-powered alarms. Their news feed is often curated by a computer program, too, one that’s taught to pick items they’ll find most interesting or that they’ll be most likely to comment on and share via social media. AI chooses what photos to show us on our phones, it asks us if it should add a meeting to our calendars based on emails we receive, and it reminds us to text a birthday greeting to our loved ones.

But advocates say AI’s ability to sort information, spot patterns and quickly pull up data means that in a short time, it could become a “copilot” for our daily lives. Already, coders on Microsoft-owned GitHub are using AI to help them create apps and solve technical problems. Social media managers are relying on AI to help determine the best time to post a new item. Even we here at CNET are experimenting with whether AI can help write explainer-type stories about the ever-changing world of finance.

So, it can seem like only a matter of time before AI finds its way into research-heavy industries like the law as well. And considering that 80% of low-income Americans don’t have access to legal help, while 40% to 60% of the middle class still struggle to get such assistance, there’s clearly demand. AI could help meet that need, but lawyers shouldn’t feel like new technology is going to take business away from them, says Andrew Perlman, dean of the law school at Suffolk University. It’s simply a matter of scale.

“There is no way that the legal profession is going to be able to deliver all of the legal services that people need,” Perlman said.

Turning to AI
DoNotPay began its latest AI experiment back in 2021 when businesses were given early access to GPT-3, the same AI tool used by the startup OpenAI to create ChatGPT, which went viral for its ability to answer questions, write essays and even create new computer programs. In December, Browder pitched his idea via a tweet: have someone wear an Apple AirPod into traffic court so that the AI could hear what’s happening through the microphone and feed responses through the earbud.

Aside from people jeering him for the stunt, Browder knew he’d have other challenges. Many states and districts limit legal advisors to those who are licensed to practice law, a clear hurdle that UC Irvine School of Law professor Emily Taylor Poppe said may cause trouble for DoNotPay’s AI.

“Because the AI would be providing information in real time, and because it would involve applying relevant law to specific facts, it is hard to see how it could avoid being seen as the provision of legal advice,” Poppe said. Essentially, the AI would be legally considered a lawyer acting without a law license.

AI tools raise privacy concerns too. The computer program technically needs to record audio to interpret what it hears, a move that’s not allowed in many courts. Lawyers are also expected to follow ethics rules that forbid them from sharing confidential information about clients. Can a chatbot, designed to share information, follow the same protocols?

Perlman says many of these concerns can be answered if these tools are created with care. If successful, he argues, these technologies could also help with the mountains of paperwork lawyers encounter on a daily basis.

Ultimately, he argues, chatbots may turn out to be as helpful as Google and other research tools are today, saving lawyers from having to physically wade through law libraries to find information stored on bookshelves.

“Lawyers trying to deliver legal services without technology are going to be inadequate and insufficient to meeting the public’s legal needs,” Perlman said. Ultimately, he believes, AI can do more good than harm.

The two cases DoNotPay is participating in will likely shape much of that conversation. Browder declined to say where the proceedings will take place, citing safety concerns.

Neither DoNotPay nor the defendants plan to inform the judges or anyone in court that an AI is being used or that audio is being recorded, a fact that raises ethics concerns. This in itself resulted in pushback on Twitter when Browder asked for traffic ticket volunteers in December. But Browder says the courts that DoNotPay chose are likely to be more lenient if they find out.

The future of law
After these traffic ticket fights, DoNotPay plans to create a video presentation designed to advocate in favor of the technology, ultimately with the goal of changing law and policy to allow AI in courtrooms.

States and legal organizations, meanwhile, are already debating these questions. In 2020, a California task force dedicated to exploring ways to expand access to legal services recommended allowing select unlicensed practitioners to represent clients, among other reforms. The American Bar Association told judges using AI tools to be mindful of biases instilled in the tools themselves. UNESCO, the international organization dedicated to preserving culture, has a free online course covering the basics of what AI can offer legal systems.

For his part, Browder says AI chatbots will become so popular in the next couple of years that the courts will have no choice but to allow them anyway. Perhaps AI tools will have a seat at the table, rather than having to whisper in our ears.

“Six months ago, you couldn’t even imagine that an AI could respond in these detailed ways,” Browder said. “No one has imagined, in any law, what this could be like in real life.”

New Apple Music, TV and Devices Apps Now Available on Windows

Microsoft is making it easier to use certain Apple services on Windows.

Last year, during its Oct. 12 Surface event, Microsoft announced that Apple Music and Apple TV would soon be coming to the Microsoft Store, as replacements for Windows alternatives that just weren’t up to par — and that day is now here.

Apple Music, Apple TV and a third app known as Apple Devices (which lets you manage your Apple devices) are now available for you to download, as long as you’re running Windows 11. We’ll briefly discuss what each of these new Apple applications can do for you on Windows, and how you can install them right now.

If you want to learn more about Windows 11, check out the best Windows 11 features and the upgraded Windows 11 features we love the best.

How to download Apple Music, Apple TV and Apple Devices for Windows 11
As long as you’re running Windows 11 version 22621.0 or higher, you can download any of the three apps to your computer. If you’re still running Windows 10 or something older, check out our guide on how to download and install Windows 11.

Now all you have to do is either click the links below or manually search for the apps in the Microsoft Store:

Apple Music (replacement for iTunes): Stream music, listen to podcasts and more from the Apple Music service. You must be a paid subscriber.
Apple TV (replacement for Apple TV web player): Watch Apple TV Plus, movies and more. You must be a paid subscriber as well.
Apple Devices (replacement for iTunes): Manage your Apple devices, including your iPhone, iPad, iPod and iPod Touch. You can sync music, movies and TV shows, as well as update, back up and restore your devices.

If you download Apple Music, Apple TV or Apple Devices (or all three), you’ll no longer be able to use iTunes. The only way to get iTunes back up and running is to uninstall whichever of the three apps you downloaded.

Also, all three Apple apps on Windows are currently previews, which means that not all features may work as expected.

My Favorite Hidden iPhone Shortcut To Turn On The Flashlight (And More)

This simple pro iPhone tip will save you time and fumbling.

My iPhone’s flashlight isn’t just a tool I casually fire up if something accidentally rolls under the couch; it’s a feature I use daily to light up the way to the bathroom in the middle of the night, scan my backyard when animals make weird sounds and… OK, yeah, find something I’ve lost under my couch. And since I use the iPhone flashlight so often, I’ve turned on a tool deep in the iOS settings menu that makes it faster to light up the torch — no more fumbling with the lock screen for the flashlight icon or unlocking the phone first.

I don’t exaggerate when I say this hidden iPhone feature has changed the flashlight for me.

Back Tap for the iPhone is an accessibility feature that Apple introduced with iOS 14. It lets you quickly perform certain actions — say, taking a screenshot or launching your camera — by simply tapping the back of your phone. Essentially, it turns the entire back of your iPhone into a button.

This is an important benefit for all kinds of people, and for me, enabling Back Tap has let me turn it into a customizable button to quickly trigger the iPhone flashlight. I’ll tell you exactly how to set it up for yourself, and you can of course customize Back Tap to trigger other actions.

Also, if you want to learn more about other iPhone and iOS features, check out these 10 next-level iOS 16 features and how to find the “secret” iPhone trackpad.

How to set up Back Tap on iPhone
Whether you want to link Back Tap to your flashlight or camera, or have it launch a different iPhone app, the path through your iPhone settings begins the same way.

On your compatible iPhone (iPhone 8 or later), launch the Settings application and go to Accessibility > Touch > Back Tap. Now you have the option to launch your action (in this case, your flashlight) with either two or three taps. Although two taps is obviously faster, I would suggest three taps because if you fidget with your phone, it’s easy to accidentally trigger the accessibility feature.

Once you choose a tap option, select the Flashlight option — or a different action if you prefer. You’ll see over 30 options to choose from, ranging from system options like Siri or taking a screenshot to accessibility-specific functions like opening a magnifier or turning on real-time live captions. You can also set up Back Tap to open the Control Center, go back home, mute your audio, turn the volume up and down and run any shortcuts you’ve downloaded or created.

You’ll know you’ve successfully selected your choice when a blue checkmark appears to the right of the action. You could actually set up two shortcuts this way — one that’s triggered by two taps and one that’s triggered by three taps to the iPhone’s back cover.

Once you exit the Settings application, you can try out the newly enabled Back Tap feature by tapping the back of your iPhone — in my case, to turn on the flashlight. To turn off the flashlight, you can tap on the back of your iPhone as well, but you can also just turn it off from your lock screen if that’s easier.

For more great iPhone tips, here’s how to keep your iPhone screen from dimming all the time and how to cancel all those subscriptions you don’t want or need.

Apple Reportedly Plans to Use Own Screens on Mobile Devices

The push would reflect the company’s effort to be less reliant on other companies.

Apple plans to begin using its own custom displays in mobile devices starting in 2024, Bloomberg reported Tuesday.

The push, intended to bring more production in-house, is expected to begin with the Apple Watch by the end of the year, according to the report, which cited people with knowledge of the matter. The displays will also appear on other devices such as the iPhone, according to the report.

Apple’s display endeavor would dovetail with the company’s efforts to make itself less reliant on components provided by third parties, in this case Samsung, which is also a key competitor in the phone market.

This isn’t the first time Apple has gone about developing its own components to reduce costs. The iPhone maker has spent years making its own 5G modem after it purchased the business from Intel in 2019 for $1 billion in order to not rely on chips made by Qualcomm.

Apple didn’t immediately respond to a request for comment.

Apple Reportedly Working on Own Bluetooth, Wi-Fi Chip

The iPhone maker is also working on its own 5G chip that might be in phones in 2024.

Apple will make its own Bluetooth/Wi-Fi chip for its iPhones to replace third-party components, according to a Bloomberg report Monday.

Currently, iPhones include chips from Broadcom to handle Bluetooth and Wi-Fi functions, and by making its own component, Apple could save itself some money. The company could start including the new chip in its phones by 2025, Bloomberg reported.

This is not the first time Apple has gone about developing its own components to reduce costs. The iPhone maker has spent years making its own 5G modem after it purchased the business from Intel in 2019 for $1 billion in order to not rely on chips made by Qualcomm. Following some delays, Apple’s 5G chip could make its way into iPhones starting in late 2024 or early 2025 instead of later this year, according to the Bloomberg report.

Apple and Broadcom didn’t immediately respond to a request for comment.

Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product should arrive next year. Here’s what we expect.

Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date of this AR/VR headset keeps sliding, with the latest report in early December from noted analyst Ming-Chi Kuo suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, we’re at the point where every Apple event feels like it could be the one where the company finally pulls the covers off this device. Bloomberg’s Mark Gurman reported in early January that he’s heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets that we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple has already laid down plenty of AR clues hinting at what its mixed-reality future could hold, and it has been active in AR on its own iPhones and iPads for years.

As far as what its device could be like, odds are strong that the headset could work from a similar playbook as Meta’s recent high-end headset, the Quest Pro, with a focus on work, mixed reality and eye tracking onboard.

Is its name Reality Pro? Is the software called xrOS?
The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called “xrOS,” but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg showed the name “Reality” showing up a lot: Reality One, Reality Pro and Reality Processor. Apple’s existing AR software framework for iOS is named RealityKit, and previous reports suggested that “Reality OS” could be the name for the new headset’s ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But it does indicate that Apple’s moving forward on a product and software, for sure.

One of several headsets?
The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.

Whatever the headset might become, it’s looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, which include what Bloomberg’s Mark Gurman says is Apple’s latest M2 chip. According to another report from Bloomberg earlier this year, Apple’s board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg’s Gurman. Gurman also suggests a focus on gaming, media and communication on this initial first-wave headset. In terms of communication, Gurman believes FaceTime using the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you’re talking to, you’d see a 3D version of their personalized Memoji avatar.

Eventually, Apple’s plans for this headset could become larger. The company’s “goal is to replace the ‌iPhone‌ with AR in 10 years,” Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That’s lighter than Meta’s Oculus Quest 2. However, it’s larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.

Read more: The Metaverse is Just Getting Started: Here’s What You Need to Know

The headset could be expensive, maybe as much as $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That’s to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and Hololens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

Apple has been in the wings all this time without any headset at all, although the company’s aspirations in AR have been clear and well telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides on iOS with its AR tools. How soon the hardware will emerge is still debated: this year, the year after or even further down the road. So is the question of whether Apple will proceed with just glasses or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and it bought AR headset lens-maker Akonia Holographics in 2018.

I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports feel well-aligned to be just that. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses probably won’t be a massive surprise if you’ve been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as are Snap and Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023
New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to a possible delay of the headset’s release to the second half of 2023, but an event announcing the headset could happen as soon as January. That timeframe would make a lot of sense, giving developers time to understand the concept well ahead of the hardware’s release, and even possibly allowing for Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end
There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can show a passthrough view of the real world and overlay some virtual objects on it, like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.

Right now, there aren’t any smart glasses manufacturers able to develop normal-looking glasses that can achieve advanced, spatially aware 3D overlays of holographic objects. Some devices like the nReal Light have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device
Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware. My experience with it could hint at what Apple’s headset might also be focusing on. It has a much higher-resolution display (a spec Apple is apparently going to try to match), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could achieve something similar with its lidar sensors. The Quest Pro does something similar, but in a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?
The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could work for air gestures and motion, and maybe even work with accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.