When Apple Might Release iOS 17 For Your iPhone

The company is tight-lipped, but past releases show a pattern.

Apple announced iOS 17 at its June keynote event, and the next iteration of the operating system is expected to include new features like Standby mode and improvements to apps like Messages when it’s released to the general public. But so far the tech giant has only said the update will be released in the fall, which isn’t very helpful.

If we look at past September Apple events and earlier iOS releases, though, a pattern emerges: we should expect Apple to release iOS 17 a few days after this year’s September event.

I reached out to Apple about when it planned on holding its September event and when it was planning to release iOS 17 and the company didn’t respond. But here’s when I think Apple will release iOS 17 based on what I see in my crystal ball.

When is Apple’s September event?
Well, it’s… in September. That’s as much as we know right now. Apple is pretty good about keeping secrets, and it hasn’t given a specific date for the event yet. But Bloomberg’s tech reporter Mark Gurman expects the event to be held on either Sept. 12 or 13, and I trust Gurman.

What makes you think iOS 17 will come out shortly after the event?
Apple might be secretive, but it’s also a little predictable.

For more than a decade, Apple has consistently held events every June and September. The company might have other special events in March (see 2022) or October (see 2021) but its June and September events happen every year like clockwork.

And after most of Apple’s September events, the company released the next iOS within the following week. In 2018, for example, Apple’s event was held on Sept. 12 (sound familiar?) and it released iOS 12 five days later on Sept. 17.

There have been only three years over the past decade when it took longer than a week for Apple to release the next iOS version — 2019, 2014 and 2013. Even in those instances, Apple still released the next iOS version eight or nine days after its September event.

The number of days between Apple’s September event and the release of the next iOS version has also trended downward over the past decade. In 2013, Apple released iOS 7 on Sept. 18, eight days after its Sept. 10 event, and last year Apple released iOS 16 on Sept. 12, five days after the company’s Sept. 7 event.
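The gap math here is simple date subtraction; a quick sketch in Python, using only the event and release dates cited above:

```python
from datetime import date

# Event and release dates cited in this article (iOS 7, iOS 12, iOS 16)
dates = {
    2013: (date(2013, 9, 10), date(2013, 9, 18)),
    2018: (date(2018, 9, 12), date(2018, 9, 17)),
    2022: (date(2022, 9, 7), date(2022, 9, 12)),
}

# Days between each September event and the iOS release that followed it
gaps = {year: (release - event).days for year, (event, release) in dates.items()}
print(gaps)  # {2013: 8, 2018: 5, 2022: 5}
```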

So my guess is Apple will release iOS 17 either five or six days after its September event this year.

What days of the week were past iOS versions released on?
Over the past decade, every iOS version update was released on a weekday. Apple released nine of the past 10 iOS versions between Monday and Wednesday, and one on a Thursday. It hasn’t released an iOS version on a Friday, Saturday or Sunday in that span.

Wednesdays and Mondays are popular days for Apple to release iOS versions, with four iOS versions being released on a Wednesday and three versions on a Monday. Apple also released two iOS versions on Tuesday. So it’s more likely that Apple will release iOS 17 on a Monday or Wednesday.

Look, just tell me when iOS 17 will be released
If Gurman is right and Apple’s event is held on either Sept. 12 or 13, and Apple releases iOS 17 five or six days later, we get a timeframe of Sept. 17-19.

However, Apple historically doesn’t release iOS updates during the weekend and Sept. 17 is a Sunday this year, so we can rule that day out. That gives us a potential iOS 17 release date of Sept. 18 or 19.
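That weekend-elimination step can be sketched the same way: add five or six days to each rumored event date, then drop Saturdays and Sundays (Python’s date.weekday() returns 5 and 6 for those):

```python
from datetime import date, timedelta

# Gurman's rumored event dates for 2023
event_dates = [date(2023, 9, 12), date(2023, 9, 13)]

# Five or six days after either event date
candidates = sorted({e + timedelta(days=d) for e in event_dates for d in (5, 6)})

# Apple hasn't shipped iOS updates on weekends, so drop Saturday/Sunday
weekday_candidates = [d for d in candidates if d.weekday() < 5]
print(weekday_candidates)  # [datetime.date(2023, 9, 18), datetime.date(2023, 9, 19)]
```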

And since Apple has released past iOS versions on Monday more often than Tuesday, I’m predicting Apple will release iOS 17 on Monday, Sept. 18. You heard it here first.

Apple’s 15-Inch MacBook Air Is Superb for Pro Photographers

As a professional photographer and video producer, I’ve been really impressed testing the new 15-inch Air.

Apple’s ultrathin MacBook Air range of laptops was originally designed mostly for lightweight office or school work on the go. But as Apple’s own M-series processors have become more capable, so too have the MacBooks they’re in. I’m a professional photographer and video producer, and I’ve spent the past couple of months with the latest 15-inch M2-equipped MacBook Air, testing it out on photo shoots, both in my studio and on location. I’m impressed, and here’s why.

For me, its 15-inch display is the perfect balance between size and portability. It’s big enough to comfortably edit photos in Adobe Lightroom and Photoshop without needing a bigger monitor. It offers plenty of screen space for all my tools, as well as providing ample room for video timelines when working with video in Premiere, away from my desk.

But the narrow display bezel and the laptop’s 11.5mm thickness mean it’ll still slip into the laptop pouch of even my smaller camera backpacks. Sure, it isn’t the lightest Air ever made, but its 1.5kg weight is an acceptable addition for my spine to cope with when I’m out on shoots. Though it feels every bit as well put together as most of Apple’s gear, I do worry that the nice, deep, dusky blue of my test model could easily get scratched and scuffed over months or years of photoshoots. Still, I managed to avoid any major blemishes during my time with it.

With Apple’s M2 chip and 16GB of RAM, my test model was extremely capable for most uses. It handled all my photo editing in Lightroom and Lightroom Classic, importing batches of 1,000-plus raw photos from my CFexpress cards extremely quickly and showing zero slowdown as I navigated the library and batch-applied editing presets. For my work on location, this has been a real treat, being able to quickly back up my files after a shoot before I hit the road.

But I increasingly work in video as well as still imagery, both for CNET and on my own YouTube channel, and I found the Air capable of blitzing through my edits in Premiere and DaVinci Resolve. I edited multiple 4K videos for my channel on this Air and imported my test Premiere project that utilized high bitrate, 4K C-Log video files, with effects and stabilization applied to all clips. I could scrub through and play back the project at full resolution without any issues, and export the final 4K file in a little over nine minutes — not bad, considering that the M2 Pro-equipped Mac Mini I tested recently did it in a little over seven.

Overall I found it to be an extremely swift machine, easily handling any of my professional editing needs for photos and 4K video production. Then there’s the battery life, which I found to be solid, barely dropping while importing and culling photos and generally allowing me to work on it without even needing to consider whether I’d get through my edits before needing to find a plug. And I haven’t even mentioned that it does all this with no fans whirring while I work.

So, it’s perfect? Well, no, and let’s address the elephant in the room: price. The 15-inch MacBook Air starts at $1,299, but that’s with only 8GB of RAM and 256GB of storage, and if you’re using it for photo and video editing, you’ll blow through that in no time (the cards I use in my camera are 2TB). Opting for more RAM and storage always means a big jump in price for Apple devices. My choice, if I were buying one, would be the model with the max 24GB of RAM and 1TB storage, but that brings the cost just north of $2,000. My review model comes in at $1,499.

But that’s still cheaper than the base 14-inch MacBook Pro and a lot cheaper than the base 16-inch Pro, so I do think the Air represents relatively decent value here. Would I choose a Pro model instead? If money were no object, then sure, I’d maybe go for the 14-inch Pro with M2 Max chip, 64GB RAM and 2TB storage, and I’d laugh about the $4,000 price tag while I drove off into the sunset in my new Lambo.

But here in the real world, I’d need to think about what I actually need and how much that’s worth. I work out and about a lot, frequently editing in cars, trains, cafes or maybe just from my bed when I can’t be bothered getting up and sitting at my desk. Traveling light is crucial to my workflow. The 15-inch Air is ideal for this, and it’s more than powerful enough to plow through my photo and video edits without breaking a sweat. And against the lofty prices of the Pro line, even the $2,000 for the 24GB, 1TB configuration I’d recommend seems reasonable.

Oh, and sure, the latest Pro models have SD card slots to appease creatives. But those slots returned just as I — and no doubt many other pro shooters — upgraded all SD cards to faster CFExpress cards. Having that slot would simply be a redundant hole I’d rarely need, and I’d still need to carry a dongle.

For power video creators editing feature films with huge numbers of 8K video tracks, effects and whatnot, no, the MacBook Air isn’t going to cut it, at least not for full-time use. But professional photographers like me who work largely in stills, with a bit of video production on the side — perhaps wedding photographers or event shooters who also want to offer video packages to their clients — will be well catered for with the 15-inch MacBook Air.

Lab Tour: See How Samsung Puts Its Galaxy Phones Through the Wringer

While touring Samsung’s quality assurance facility, we saw phones get dropped, dunked and tumbled.

Inside a nondescript building on Samsung’s campus in Suwon, South Korea, workers sit in cubicles typing away at their desks. Leaf-shaped awnings hang overhead to provide some shade from the bright office lights. But once you leave the main office area and turn a corner, the workplace starts to look a lot different.

Tucked away in hallways throughout the floor are Samsung’s reliability testing labs. It’s where Samsung tortures its phones before shipping them out to the millions of people buying them around the world. Robots drop phones on metal surfaces, water jets pummel mobile devices from all directions, and chambers immerse gadgets in extreme temperatures.

It’s all part of Samsung’s process to ensure its phones can withstand drops, dunks, extreme climates and other hardships. This type of testing is critical because phones are expensive and essential to daily life, so they must be built to last.

Samsung is no stranger to reliability issues. It infamously recalled the Galaxy Note 7 in 2016 because of overheating issues and delayed its first foldable phone launch in 2019 after some reviewers experienced broken screens. Last year, reports of batteries swelling inside Samsung phones surfaced on YouTube.

CNET and other journalists got a rare glimpse inside Samsung’s lab to see how it stress tests devices. The tour provided a sense of how Samsung thinks about durability when it comes to phones — not just the tests themselves but also the types of scenarios Samsung accounts for. That’s important because Samsung ships more smartphones than any other company worldwide, according to the latest figures from market research firm International Data Corporation.

When thinking about whether your phone is durable, the first thought that likely comes to mind is whether it can survive drops. According to a 2020 study from AllState Protection Plans, 140 million Americans have damaged their phones at some point in their lives.

And Samsung’s facility has plenty of machines at work to account for all the different ways you might throw, crack or shatter your phone. It’s not as simple as just dropping the device over and over again. The tests in Samsung’s lab are designed to evaluate how phones react when falling at different angles and heights, while also zeroing in on specific parts of the phone, like the screen and cover glass.

One such assessment known as the tumble test involves putting devices in a giant rotating rectangular chamber, where they’re tossed around over and over again. This test is meant to examine how phones hold up after being hit from different angles.

In another corner of Samsung’s facilities, I almost cringed as a steel ball dropped directly onto the screen of a Galaxy Z Fold. For this test, Samsung is measuring how durable the display, back glass and internal parts of a device are after enduring impact.

For less dramatic tumbles, Samsung has a low drop test. A machine suctions up a device and drops it from a lower height, which is meant to simulate the cumulative effect of minor mishaps, like flicking your phone across a table. Although this test doesn’t look as brutal as the others, it shows that Samsung is thinking about how minor slip-ups could impact a phone’s condition over the long term.

Foldable phones require their own special testing. I watched as robotic arms opened and closed Galaxy Z Flips and Folds over and over again almost in rhythm. When asked how many open-and-close cycles these devices must go through to pass the test, the member of Samsung’s reliability team giving the tour simply said “a lot of times.”

The goal is to understand how many times the average person would fold and unfold their device in a given period during regular use, the Samsung employee said through a translator. Such durability tests are particularly important for foldable phones. A quick Google search turns up dozens of stories of broken screens on the Z Flip across Reddit, YouTube and Samsung’s community forums. Samsung says the Galaxy Z Fold and Z Flip are tested to outlast 200,000 folds, or roughly five years of use if unfolded 100 times per day.
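Samsung’s five-year figure checks out with some quick arithmetic:

```python
rated_folds = 200_000  # Samsung's rated fold count for the Z Fold and Z Flip
folds_per_day = 100    # the usage assumption in Samsung's estimate

days = rated_folds / folds_per_day  # 2,000 days of use
years = days / 365
print(f"{years:.1f} years")  # 5.5 years, which Samsung rounds to "roughly five"
```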

After a decade of reviewing tech products, I’ve become accustomed to deciphering ingress protection ratings, or the scores that tell you whether your phone is water or dust resistant. What I haven’t seen, however, is exactly what the testing behind those ratings looks like in real life.

Samsung’s reliability lab has various machines to test for different levels of water resistance. I watched as water jets soaked a Samsung phone as part of the company’s testing for the IPX4 rating, which ensures that phones can endure being sprayed with water. In the same room, a phone sat at the bottom of a towering water tank to test for more demanding ratings like IPX8.

Resistance to drops and dips in the pool may be the first scenarios that come to mind when you consider whether a smartphone is durable. But Samsung has dozens of other tests meant to assess more granular aspects of your device as well. Robots plug in chargers and press the side key continuously to test a phone’s USB port and buttons. Machines drag the S Pen across the Z Fold’s display over and over again to make sure pressure from the stylus doesn’t impact durability.

But Samsung’s testing lab isn’t just an automated torture chamber for phones. The company also has dedicated spaces for testing how phones hold up in real-world scenarios. One such area, for example, is meant to mimic cafes, park benches and restaurants to see how a device’s cameras perform in those environments.

What stood out to me the most is the way Samsung tests whether devices can withstand extreme temperatures. The company has dedicated chambers for immersing devices in extreme cold and highly humid conditions to ensure phones will function properly in climates around the world. The environmental test chamber looks like an unassuming metal-and-glass box from the outside, but the stark temperature shift is immediately jarring once you reach inside. One chamber I observed was set to minus 20 degrees Celsius (minus 4 degrees Fahrenheit), although Samsung conducts tests at other temperatures as well.

Phones We Still Expect to See in 2023: iPhone 15, Google Pixel 8

Apple and Google usually save their big product announcements for the fall.

We’ve already seen plenty of smartphone launches from Samsung, Motorola, OnePlus and Google this year. But there’s still likely more to come in the fall.

Though it’s hard to predict exactly what’s in store for the phone industry, it’s possible to make some educated guesses, since many companies stick to the same launch routine year to year. The iPhone 15 lineup, for example, is expected to arrive in September, possibly with USB-C charging for the first time. Google’s rumored Pixel 8 could launch in the fall, likely with a new Tensor processor.

Such launches would follow the subtle but important progress that phones made in 2022. The iPhone 14, for example, gained satellite connectivity for emergencies and car-crash detection, while Google found more ways to make use of its custom Tensor chip in the Pixel 7 and 7 Pro. Samsung, meanwhile, gave its flagship Galaxy S lineup a fresh look and an upgraded camera last year, while this year’s S23 is a modest step forward.

Here are the new phones we’re expecting to see in the fall, based on previous launch cycles, rumors and reports.

Apple iPhone 15 lineup

What we’re expecting: Apple’s new iPhone family usually launches in September, and we have no reason to believe 2023 will be any different. The adoption of USB-C charging is one of the biggest changes we’re expecting to see on Apple’s next-generation iPhones.

The European Union recently mandated that all new phones sold in the region must support USB-C charging by 2024. Apple said it would comply with these rules but did not specify whether that means we’ll see a shift to USB-C starting in 2023. It’s also not confirmed if a USB-C iPhone would get a global release, or if it would remain solely a European model.

Otherwise, we’re expecting to see the Dynamic Island arrive on the regular iPhone 15, according to Ross Young, CEO of Display Supply Chain Consultants, and Bloomberg. Apple may also minimize the borders on the iPhone 15 Pro’s display by using a technology called low-injection pressure over-molding, Bloomberg also reports.

We’re also expecting to see a new periscope camera with better optical zoom for the iPhone 15 Pro Max and solid-state buttons for both Pro phones, according to TF International Securities analyst Ming-Chi Kuo. The analyst also believes Apple may introduce more features that distinguish the Pro Max from the smaller-sized iPhone 15 Pro.

Why I’m excited about it: The iPhone’s long-anticipated transition to USB-C is arguably the biggest reason to get excited about Apple’s next smartphone. The switch means iPhone users will finally be able to charge their iPhone, iPad and Mac with the same type of charging cable, reducing friction and making the iPhone that much more convenient. I’m also looking forward to seeing whether Apple further distinguishes the iPhone 15 Pro Max from the iPhone 15 Pro. I’ve argued that Apple needs to give its supersized iPhones more functionality that takes advantage of their larger screens, similar to the iPad.

OnePlus foldable phone

What we’re expecting: OnePlus plans to launch its first foldable phone in the second half of 2023, the company said before its event at Mobile World Congress in February. We don’t know much else. The company has yet to announce any details about the device itself, precisely when it plans to launch the phone, or how much it could cost.

But some leaks have painted a picture of what we might expect. Prominent leaker Steve Hemmerstoffer (better known as OnLeaks) shared what are said to be details about the phone with blog MySmartPrice. The leak suggests OnePlus’ foldable will have a 7.8-inch internal screen, making it larger than the Galaxy Z Fold 5’s, and a triple-lens camera with 48-megapixel wide and ultrawide cameras and a 64-megapixel telephoto lens. It’s also expected to run on Qualcomm’s flagship Snapdragon 8 Gen 2 processor.

Why I’m excited about it: Like Google, OnePlus has a reputation for beating Samsung and Apple on price. That makes me hopeful that OnePlus’ foldable phone will be significantly less expensive than the $1,800 Galaxy Z Fold 5. The Oppo Find N2 from OnePlus’ sister brand has also been well received, with CNET’s Sareena Dayaram calling it the lightest foldable she’s ever carried, so there’s a chance OnePlus could follow in its sibling’s footsteps. Plus, it’ll be nice to see Samsung face more competition in this space.

Google Pixel 8 lineup

What we’re expecting: Rumors have been swirling about what to expect from the Pixel 8 family, which is expected to arrive this fall. The most significant update we may see is the introduction of a temperature sensor on the Pro model, according to leaker Kuba Wojciechowski, who shared this information with the blog 91mobiles.

German tech blog WinFuture reports that it found references to two unreleased Pixel smartphones in publicly available code. The findings indicate that these two devices are codenamed “Shiba” and “Husky” and that they’re powered by a new processor codenamed “Zuma.” The code also suggests these devices will run on Android 14 and include 12GB of RAM, according to WinFuture.

Prolific leaker Steve Hemmerstoffer also partnered with the tech blogs MySmartPrice and SmartPrix to publish what are said to be renderings of the Pixel 8 and Pixel 8 Pro. Based on these images, the two new phones will have a similar design with softer edges compared to the Pixel 7 and 7 Pro.

Another leaker, Yogesh Brar, also claims the Pixel 8 will include a new Google Tensor chip called the G3, a 50-megapixel main camera and a 12-megapixel ultrawide camera. Based on this leak, the Pixel 8 should arrive in early October starting at $649.

Why I’m excited about it: I’m most interested in the new features Google’s next-generation chip will bring to its future phones. Google’s current Tensor chips have enabled features that seem practical and useful in everyday life, such as Magic Eraser and Face Unblur for improving photo quality and the ability to add speaker labels to transcripts in the Recorder app. That makes me excited about where Google could take things next. It’ll also be interesting to see whether Google brings a temperature sensor to the Pro model, and what potential use cases and features it may have in mind for that.

Overall
It seems like the most dramatic changes to new smartphones in 2023 will arrive on premium devices like foldables and “pro” versions of flagship devices. That makes sense given sales of premium smartphones accounted for more than half of global smartphone revenue in 2022, according to Counterpoint Research. We’ll know more throughout the year as more reports and rumors arrive, and as OnePlus, Apple and Google actually debut their devices.

iPhone 15 Rumored September Launch Date Creeps Closer

Apple is reportedly prepared to launch its next iPhone in mid-September.

Apple appears to be ready to launch the iPhone 15 next month. The company’s next iPhone will go on sale around Sept. 22 after being unveiled at an event planned for either Sept. 12 or Sept. 13, according to a report from Bloomberg.

This would line up with past iPhone launch events, as Apple tends to hold its events on Tuesdays and Wednesdays. The iPhone 14 was announced on Wednesday, Sept. 7 of last year, while the iPhone 13 event was held on Tuesday, Sept. 13 in 2021. iPhones are typically released a week and a half after they are announced, which is generally around the third week of September.

Bloomberg’s Mark Gurman followed up on the earlier report in a tweet on Tuesday saying “signs are increasingly pointing to Sept. 12 as the iPhone 15 event date,” but noted that plans could still change.

The new phone is rumored to feature thinner bezels, a faster processor, an updated camera and USB-C charging. While this could be one of the biggest overall updates the iPhone lineup has seen since the iPhone 12 debuted in 2020, it remains unclear if these changes will be enough to entice customers to upgrade.

Apple is facing strong headwinds as it plans to launch its latest iPhone. The tech giant acknowledged a slump in the US smartphone market during its earnings call last week. Apple said on Thursday that sales of the iPhone 14 fell 2.4% in its fiscal third quarter ended July 31.

Phone makers like Samsung and Motorola — though behind Apple in terms of market share — have made inroads by releasing phones with radically new designs. The Motorola Razr Plus and the Samsung Galaxy Z Flip 5 have brought the classic flip phone aesthetic to modern smartphones, while the Samsung Galaxy Z Fold 5 creates a folding phone/tablet hybrid. Samsung’s folding phones are proving to be popular with consumers, selling almost as well as its nonfolding models.

Apple has yet to release a folding phone and doesn’t appear to have immediate plans to do so. Apple’s phone design has had the same general shape since the iPhone debuted in 2007.

Can AI Help Me Find the Right Running Shoes?

With my first marathon less than 100 days away, I gave artificial intelligence a go in hopes of finding the perfect trainers.

Like a lot of other runners, I obsess over shoes. Compared with other sports, running doesn’t require a lot in terms of equipment, but you can’t cut corners when it comes to your feet.

For me, a good fit and comfort are most important, but I also don’t want shoes that will slow me down. Super-cushioned sneakers might be great if you’re doing a loop around the neighborhood with your friends, or if your job requires you to spend all day on your feet, but not when you’re trying to cut a few minutes off a race time.

That search for the perfect combination has felt like a never-ending quest since I started running a couple years ago. Now, training for my very first marathon, the TCS New York City Marathon on Nov. 5, the stakes are higher than ever. So when I was offered the chance to try out Fleet Feet’s new and improved shoe-fitting software that’s powered by artificial intelligence, I went for it.

But that doesn’t mean I wasn’t skeptical about its capabilities. Up until recently, a lot of consumer-facing AI has been more hype than reality. Meanwhile, I’ve been shopping at Fleet Feet, a national chain of specialty running stores, since shortly after joining my neighborhood running group in March 2022.

For more than a year, the company’s in-house shoe nerds, whom Fleet Feet refers to as outfitters, have largely kept my feet happy. They’ve answered all of my nitpicky questions, and their recommendations have changed as my running needs and goals evolved over time.

How does AI play into that?

In this case, AI provides a way to let store employees quickly compare the specific dimensions of my feet with those of millions of others, along with the designs of the shoes in their inventory, to pick out which ones might fit me the best.

The AI isn’t designed to replace expert employees, it just gives them a better starting point for finding shoes with the correct fit, says Michael McShane, the retail experience manager for the New York store I visited.

“It turns the data into something much more understandable for the consumer,” McShane says. “I’m still here to give you an expert assessment, teach you what the data says and explain why it’s better to come here than going to a kind of generic store.”

Anyone who’s ever set foot, so to speak, in a running store knows there are lots and lots of shoes out there, and everyone’s feet are different. What feels like a great shoe to one person could be absolute torture for another to run in.

Getting to know your feet with a 3D scan
Originally rolled out in 2018, Fleet Feet’s Fit Engine software analyzes the shapes of both of a runner’s feet (collected through a 3D scan process called Fit ID), taking precise measurements in four different areas. It looks at not just how long a person’s feet are, but also how high their arches are, how wide their feet are across the toes and how much room they need at their heel.

Plates in the scanner also measure how a person stands and carries their weight. Importantly, the scanner looks at both feet. Runners especially put their feet through a lot of use and abuse, making it likely that their feet will be shaped differently.

Mine were no exception. One of my feet measured more than a half size bigger than the other. I can’t say I was surprised. In addition to ramping my training up to an average of 20 miles a week over the past year, my feet have also suffered through 17 years on the mean streets of New York, two pregnancies and one foot injury that left me with a wonky right big toe.

What was a little surprising was that both feet measured bigger than my usual size 9 or 9.5. I’ve always had big feet, especially for a woman who stands just over 5 feet tall, but I’ll admit that it was still a little traumatizing to be trying on shoes a full size larger than that for the first time.

The software’s AI capabilities allow the system to then quickly compare the data from a customer’s scan to all of the shoes in the store’s inventory, as well as the millions of other foot scans in the system. Each shoe is graded on how its measurements match up with the customer’s. Color-coded graphics show how each shoe measures up in specific areas.
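Fleet Feet hasn’t published how Fit Engine grades shoes, but the idea it describes (comparing each of the four measured areas against a shoe’s dimensions and color-coding the result) can be illustrated with a sketch. The measurement names, sample numbers and tolerance below are my own hypothetical choices, not Fleet Feet’s:

```python
# Hypothetical illustration of per-area fit grading; none of these names,
# numbers or tolerances come from Fleet Feet's actual Fit Engine.
FIT_AREAS = ("length", "arch_height", "toe_width", "heel_width")

def fit_grades(foot: dict, shoe: dict, tolerance: float = 0.15) -> dict:
    """Grade each measured area by how far the shoe's dimension is
    from the foot's, as a fraction of the foot measurement."""
    grades = {}
    for area in FIT_AREAS:
        diff = abs(shoe[area] - foot[area]) / foot[area]
        grades[area] = "good" if diff <= tolerance else "poor"
    return grades

# Made-up measurements in centimeters, for illustration only
foot = {"length": 25.8, "arch_height": 2.1, "toe_width": 9.6, "heel_width": 6.2}
shoe = {"length": 26.0, "arch_height": 2.0, "toe_width": 9.9, "heel_width": 7.5}
print(fit_grades(foot, shoe))  # heel_width grades "poor"; the rest "good"
```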

While store employees have used versions of the software, including the AI, over the years, Fleet Feet says the latest improvements make it consumer-facing for the first time, instead of something that takes place completely behind the scenes. The ultimate goal is to add it to the company’s website to make it easier to find shoes that fit online, something that’s notoriously tricky even for the biggest running shoe enthusiasts.

In addition to telling McShane and me how well a shoe could potentially fit, the software gave me a specific starting size to try on, since sizing can vary depending on shoe brand and model.

And I sure did try on shoes. The AI gave McShane loads of suggestions to start with, but it was up to him to narrow it down for me, taking into account my training needs and preferences. Ultimately, I wanted something cushioned and comfortable enough to get me through a marathon, but still light and agile enough that I wouldn’t feel clunky or weighed down.

I also wanted something new. After a year of almost religiously wearing Hoka Cliftons for everyday runs, they now felt too bulky and slow. I also liked the Brooks Ghost trainers, but more for walking around New York than racing.

And I was more than happy to say goodbye to a pair of Nike Zoom Fly 5 shoes that I bought for the NYC Half Marathon. Their carbon-fiber plates and light construction made them super speedy, but their lack of heel cushioning gave me monster blisters that would explode and bleed. Sure, I could have taken them back, but I liked their speed so much I just taped my feet up every time I wore them to protect against the rubbing.

What I walked away with
I spent well over an hour at Fleet Feet trying all kinds of shoes. Since the AI had pinpointed the appropriate size for each model, the sizes I tried on varied but they all pretty much fit. That in itself was a time saver. The main challenge was figuring out what felt the most comfortable when I took a jog around the store.

A pair of Brooks Glycerin felt cushy, but also a bit clunky. I loved a pair of Diadoras from Italy, but they ran small and the store didn’t have my size, which probably would have been a monster 10.5, in stock. Conversely, a New Balance model I tried seemed too roomy to give me enough support.

For me, it was about finding the right level of cushioning and weight. Per McShane’s advice, I tried my best to ignore colors. When it comes to running shoes, I’m a big fan of bright, fun colors, but looks don’t help with comfort or cut seconds off your mile pace.

After many, many boxes, it came down to the Asics Gel-Cumulus and Mizuno Wave Rider (both $140). Both were light and springy and I took more than one jog around the store in both of them. I also tried them out with a new pair of insoles ($55), which also were fitted to me with the help of the AI.

I’ve never used insoles before, but I was told that they would give me greater support for the kind of double-digit mile training I had ahead of me, improving my endurance and reducing the chance of injury. Socks are also key to preventing dreaded blisters, so I grabbed a pair of my go-to Feetures Elite Ultra Lights ($18).

After much debate, I ended up walking out of the store with the Mizunos. While I’ve had Asics in the past, I’ve never tried Mizunos before. They seemed a bit faster and more tailored to my feet than the Asics were. It also turned out that they were on sale and I ended up getting them for $105.

That’s because there’s a new version rolling out that the store didn’t have in stock yet, so they weren’t in the system for the AI to find. While it was nice to save $35, had I known that I might have gone with the Asics just because they’re more current.

After four runs totaling about 25 miles, I still like the shoes, though the insoles have taken some getting used to. I’m also thinking about picking up a pair of the Asics just to compare.

For most people, this use of AI will probably go unnoticed, at least until it’s added to the website. While it’s now officially geared toward consumers, it still seems more like a tool for store employees. Sure, data-crunching AI can be great, but it’s the efforts and expert advice of the outfitters themselves that are going to ensure that I keep coming back to their stores.

After all, the TCS NYC Marathon isn’t until Nov. 5 and I’ve got a long road of many miles and many, many pairs of shoes ahead of me before I reach the starting line.

The iPhone 15 Is Coming Soon: Here’s What to Expect From Apple

The rumor mill is buzzing about a significant upgrade to the iPhone. But don’t expect a foldable device from Apple this year.

Apple is weeks away from its annual fall event, where the next iPhone, which we’re unofficially calling the iPhone 15 series, is set to be unveiled. Although we don’t expect to see a foldable from Apple just yet, the rumor mill is buzzing about the next iPhones departing from their traditional design in a meaningful way.

Some of the big questions people are asking are: Will the iPhone 15 get a USB-C port? Will the iPhone 15 series have slimmer bezels? Will Apple increase iPhone prices in 2023? Will the Pro models receive bigger upgrades?

We won’t know for sure until Apple throws the next iPhone event, which will probably be in September. But here are some of the biggest and most credible rumors to give you an idea of what to expect from the iPhone 15 series.

iPhone 15 design: Hello USB-C, goodbye Lightning
This one has been circulating around the rumor mill for years now, but in 2023 the switch from a Lightning port to a USB-C port could finally happen. That’s likely driven by pressure from the European Union, which has been pushing for a common charging standard for years. In 2022, the bloc passed legislation requiring Apple to equip its iPhones with USB-C ports by 2024 if it wants to sell them in the EU.

If that happens, the question is whether Apple will switch all iPhone models to USB-C or just those sold in the EU. Apple already modifies iPhone models regionally, as it has done with the iPhone 14: The US version has an electronic SIM, while other variants retain the SIM slot. However, there are good reasons to move all iPhones to USB-C moving forward, according to Avi Greengart, analyst at Techsponential.

“There are larger ecosystem, security, and accessory considerations with the power/data connector, so I think it is more likely that Apple moves all iPhones [globally] to USB-C in the iPhone 16 timeframe to comply with European regulations,” he told CNET in an email.

According to seasoned Bloomberg reporter Mark Gurman, only the iPhone 15 Pro and iPhone 15 Pro Max models will receive USB-C ports this year. Perhaps a complete transition could happen next year with the iPhone 16.

Read more: Your Next iPhone Will Probably Need a Different Charging Cable

iPhone 15 design: Dynamic Island expands to all models
Apple is likely to continue selling four iPhone models with the iPhone 15 lineup. Rumors point to a generally similar design across the board, except that the iPhone 14 Pro’s shape-shifting cutout, known as Dynamic Island, is set to make its way across all models.

That rumor comes from display analyst Ross Young, who also said in a September tweet that he’s not expecting base iPhone 15 models to have a higher refresh rate like Apple’s Pro iPhones because the supply chain can’t support it. Gurman also still expects this to pan out, as indicated in the June 30 edition of his Power On newsletter.

Read more: iPhone 14 Pro’s Most Eye-Catching Feature Feels Like It’s Winking at Something Else

iPhone 15 design: Skinnier bezels
According to Gurman, Apple’s expected to use a new manufacturing technology called “low-injection pressure over-molding” to make the Pro iPhones. This is the same method used for some Apple Watch models. It would help Apple shave fractions of a millimeter off the bezels, which would in turn allow for an ever-so-slight increase in screen real estate.

iPhone 15 design: Easier repairability
The interiors of the iPhone 15 Pro and iPhone 15 Pro Max are also slated for a redesign that will make them easier to repair, according to the latest edition of Gurman’s Power On newsletter. Gurman says the inside parts have been changed to match the ones in the iPhone 14 and iPhone 14 Plus, which received the internal changes last year.

“This is the iPhone 14 reborn as a beautiful butterfly — a midframe in the middle, accessible screen on the left, and removable rear glass on the right,” iFixit wrote in a post last September after the iPhone event.

Interestingly, Apple didn’t discuss this internal redesign in its keynote, but the change was spotted by repair experts at iFixit, who said it was evident that Apple went back to the drawing board to rework the internals and integrate them seamlessly into its iPhones.

iPhone 15: Upgraded ultra wideband
According to noted Apple analyst Ming-Chi Kuo, the iPhone 15 will run on an upgraded ultra wideband processor, which Apple calls the U1 chip, to better integrate with the company’s new AR headset, the Vision Pro. UWB is a short-range wireless communication standard often used to track down the location of objects. It can pinpoint your Apple AirTag or unlock your car as you walk up to it with your phone. In a recent post on Twitter, Kuo said this is all part of Apple’s broader strategy to “build a more competitive ecosystem for Vision Pro.”

iPhone 15: Wireless charging upgrade
According to a May report by ChargerLab, a power specialist website with a steady track record, all iPhone 15 models will support 15-watt wireless charging using the Qi2 open standard. If this turns out to be true, it’ll mean the iPhone 15 could open up a whole new world of wireless charging devices that can replenish the device at its full speed. Apple had previously limited open wireless charging standards to 7.5W, leaving the full 15W charge speed for Apple MagSafe licensed accessories.

iPhone 15 camera: Periscope-style telephoto lens arrives
Noted Apple observer Ming-Chi Kuo, an analyst with TF International Securities, forecasts that the iPhone 15 Pro Max will receive a periscope-style telephoto lens. This sort of telephoto lens allows for higher optical zoom levels, with Kuo forecasting a 6x optical zoom could arrive in the iPhone 15 Pro Max. The optical zoom on the iPhone 14 Pro Max is limited to 3x, which lags rivals such as the Samsung Galaxy S22 Ultra’s 10x optical zoom. This rumor was recently bolstered by well-known leak source and Twitter user Unknownz21, who stated that the Pro Max model will come with the special lens.

iPhone 15 design: Solid-state buttons come to Pro iPhone 15 models
Kuo expects Apple to differentiate further between its base and Pro models in the coming years. One way he’s expecting that to happen is by way of solid-state volume and power buttons on the iPhone 15 Pro models instead of the mechanical buttons present on today’s devices.

The solid-state buttons, which Kuo says will be similar to the home button found on the iPhone SE and iPhone 7, mimic the tactile feel of pressing a button with the help of haptic feedback. The apparent advantage of this type of button is that it also protects against water ingress.

iPhone 15 power: Increased RAM for Pro models
According to Taiwanese research firm TrendForce, Pro models of the iPhone 15 lineup will get a bump in RAM, from 6GB to 8GB, to complement the anticipated A17 Bionic chipset. Base models will continue to get 6GB of RAM, according to TrendForce. The rumor is apparently backed up by a research report from analyst Jeff Pu of Haitong International Securities, as referenced in a MacRumors article.

iPhone 15 price: Up, up and away?
Prices have dramatically increased since the original iPhone arrived in 2007. And that may happen again in 2023 with the iPhone 15, except not in the way you might think. The price of the regular iPhone 15 is currently expected to remain the same, according to analysts who previously spoke with CNET.

However, the upper limit of the price range could be pushed higher if rumors about a luxe iPhone 15 Ultra turn out to be true. The rumored Ultra model could potentially replace the iPhone 15 Pro Max next year, Bloomberg’s Mark Gurman wrote in September last year. This falls in line with predictions from Kuo, who expects Apple to differentiate further between the iPhone Pro and iPhone Pro Max models. However, other rumors suggest that the iPhone 15 Ultra will be a step up from the iPhone 15 Pro Max. Gurman made no mention of the iPhone 15 Ultra in his June 30 newsletter.

US prices currently range from $829 for the entry-level iPhone 14 model (128GB) all the way up to $1,599 for the highest-end iPhone 14 Pro Max with 1TB of storage.

Read more: What Apple Could Do With iPhone 15 Prices in 2023

iPhone 15 Ultra camera: Variable zoom
According to tipster Revengus, the iPhone 15 Ultra will feature a telephoto camera with a variable zoom lens, the same camera setup rumored for Samsung’s Galaxy S24 Ultra. Variable optical zoom (continuous zoom) cameras aren’t commonly found on smartphones for a variety of reasons, including the fact that the size and design of phone cameras restrict the types of lenses that can be used.

iPhone 15: Launch and release timeline
Apple holds its annual iPhone event in September almost every year, so we’d expect the timeline to remain the same for the iPhone 15. New iPhones typically get released shortly thereafter, usually the Friday of the following week. Sometimes Apple will stagger release dates for specific models, especially when introducing a new design or size. So it’s possible that the iPhone 15 lineup will have more than one release date.

Here’s what we know:

Apple tends to hold its events on Tuesdays or Wednesdays. Apple’s iPhone 14 event was held on Wednesday, Sept. 7, while its iPhone 13 event was held on Tuesday, Sept. 14.
iPhone release dates are typically a week and a half after Apple’s announcements.
In general, new iPhones are released on a Friday, around the third week of September. For the iPhone 13, preorders began Sept. 17 and the phones went on sale Sept. 24.
Looking for more iPhone advice? Check out our iPhone upgrade guide, our list of the best iPhones and our roundup of the best cases for your iPhone 14 or 14 Pro.

Qualcomm’s Appointment-Only Museum Shows Early Phones, Mobile Technology

Samsung, Google and Qualcomm’s mixed reality platform is still in development and mostly remains a big mystery. But its mobile focus could give it a unique edge.

It’s already been a big year for VR: the PlayStation VR2 headset added tethered VR gaming to Sony’s PS5; Meta’s upgraded standalone Quest 3 headset arrives this fall; and Apple’s first mixed reality device, also standalone, is coming next year.

Samsung, Google and Qualcomm also have some sort of mixed reality platform in the works. Samsung made a drive-by announcement of it at its winter Unpacked event, and it was mentioned briefly again at Google’s I/O developer conference. But we really don’t know much more right now other than that these three companies are collaborating on it. Going by the last mention at Google I/O, more information is expected by the end of the year. Will Samsung say more at its next Unpacked event this week? It’s unclear, but even if the platform comes up, expect the aura of mystery to continue.

While I’ve wondered about this mystery platform before, after trying Apple’s Vision Pro, I’m more convinced Samsung, Google and Qualcomm may take a different approach. In particular, I expect the focus will be on an area Apple almost completely avoided with the Vision Pro: phones.

Mobile is the missing link
Qualcomm makes its own AR- and VR-optimized mobile processors, which are already in most AR and VR devices right now: the Quest 2, Quest Pro, HoloLens 2, Pico 4, HTC Vive XR Elite and a bunch of others.

Many of these headsets are standalone — in that they aren’t tethered to another device and function independently of phones. The Quest 2, for instance, can bridge with a phone for notifications and some other streaming features, but it doesn’t coexist with Android or iOS.

Qualcomm is working on another, bigger initiative in VR and AR, however, and has been chipping away at it for years. Qualcomm’s road map for XR (extended reality) eyewear points increasingly toward devices that bridge with phones and will eventually lean on phone and cloud processing power to do the heavy lifting, leading toward smaller glasses. There are already some early examples: the Nreal Light and Lenovo’s ThinkReality glasses. So far, though, no devices use Qualcomm’s smaller AR2 chipset platform, which relies on phones to drive wireless AR on glasses that look nearly normal.

Qualcomm is clearly building the chipset future for these devices, while Samsung’s role will likely be on the rest of the hardware design, and particularly displays and cameras.

Finally, there’s Google. Google’s presence in this three-way picture looms large as a key part of the puzzle: the software glue linking existing platforms together.

Google and Samsung worked together on developing Wear OS 3 and its software interfaces, using the Galaxy Watch 4 as a first exploration a year ahead of Google’s Pixel Watch, with a focus on better Android-to-watch connections. Google’s folding-phone and recent tablet efforts are, in a similar spirit, building Android out in another direction: toward multiple displays, larger formats and multitasking.

I’d expect the trio’s mixed reality headset efforts are building toward ways to have a headset talk to phones — and maybe even wearables like watches — to create an experience that feels more interconnected to the things we already have in our pockets. If that’s the case, version one of whatever Samsung has in store will feel much different than the Vision Pro.

Qualcomm already has a software bridge between phones and AR glasses called Snapdragon Spaces. I referred to it earlier this year as non-ideal, only because it seems to work as a subset of what already exists on Android and Google Play. Qualcomm’s already playing with ways phones can be used as handheld controllers for these AR glasses. I’m waiting for Google to truly enable a system-wide way Android can work in XR, though, almost like how Apple has moved iOS into Vision Pro. Samsung feels overdue for a VR/AR return too. That could, or should, be what Samsung, Google and Qualcomm make happen in 2024.

Another option: standalone with better links to mobile
Another direction could end up being a standalone device, following a similar path to the Quest 3, Apple’s Vision Pro and HTC’s Vive XR Elite. That space is getting crowded, though. To make a standalone headset, Samsung would either need to introduce dedicated VR/AR controllers (something handheld, or maybe wearable, like a band or ring), explore ways phones and watches could be input devices or go the Apple route, and use hand/eye tracking and voice to interact.

The problem here, as I see it: what’s the app library? Meta has its own custom collection of games and apps built up over years. Samsung leaned on its Oculus relationship for its older phone-based Gear VR goggles nearly a decade ago.

Apple is already clearly taking an iOS-first approach, filling Vision Pro with compatible iPad apps and building out mixed reality functions for developers over time, while also introducing a lot of new Apple-made mixed reality apps as well. But Apple’s headset, while it runs iPad apps, isn’t directly cross-compatible with iPads or iPhones or Apple Watches to interact with these apps at the same time… at least, as far as we know right now.

I expected Apple to make mobile device-to-Vision Pro interactions a key part of its headset design, but I was wrong. Samsung could use its platform to make the first steps in this territory ahead of Apple, playing with the phone-to-headset tools Qualcomm is already building, and adding in new Google apps and Android compatibility that could be part of the partnership.

Either way, we may not know until 2024
Samsung may keep its plans more of a mystery for now, especially since Apple’s Vision Pro isn’t actually launching until early next year. Still, I’d expect news to emerge ahead of the official launch, either from Samsung, Qualcomm, Google or all three. Apple tends to keep its products a total mystery until they’re unveiled, but Samsung may take a more gradual approach to its news.

These are all guesses, though. All I know is that phones are the most interesting part of where AR glasses are headed, and Samsung’s biggest focus is on phones and the devices that connect with them. That’s also the territory where XR devices need the most help. Samsung, I’m awaiting what comes next. It could help point the way for the next wave of glasses later in the decade.

Facebook Now Blocking News in Canada and Google May Follow: What To Do

In response to the Canadian Online News Act, which aims to compensate publishers, both Meta and Google are instead looking to block news entirely.

Meta, the parent company of Facebook, is beginning the process of blocking news in Canada, the company said in a blog post on Tuesday. Google also aims to block links to Canadian journalism for people in Canada later this year, in response to a new law that forces technology companies to compensate publishers for linking to articles.

“The legislation is based on the incorrect premise that Meta benefits unfairly from news content shared on our platforms, when the reverse is true,” Meta said in a blog post. “News outlets voluntarily share content on Facebook and Instagram to expand their audiences and help their bottom line. In contrast, we know the people using our platforms don’t come to us for news.”

With Australia passing a similar measure in 2021, more countries are looking toward compensatory legislation as news outlets continue to lay off journalists in record numbers while Silicon Valley giants rake in hundreds of billions of dollars in revenue.

While both Google and Meta argue that online platforms helped lift websites with increased traffic and ad revenue, others argue the opposite. Critics, such as the News Media Alliance, say Big Tech scraped reporting and data from journalistic outlets to fill its platforms with results and content. It then used its outsized power over traffic to abruptly change how many clicks websites get, all while still using outlets’ reporting to fill search results and social media feeds. Google, which also controls a large portion of the online ads market, is in the midst of a Department of Justice antitrust lawsuit, with the government alleging the search giant used its power to block out competition and take profits meant for other advertisers and publishers.

As we face a potential standoff between lawmakers and journalists on one side and the gatekeepers of the internet on the other, here’s what you need to know about Canada’s Online News Act and how it might impact you.

What’s up with Google, Facebook and Canada?
The Online News Act, which goes into effect at the end of 2023, compels Google and Meta to compensate publishers when linking to news content. It’s part of an effort to give news publishers an infusion of cash after the internet revolution upended outlets’ traditional revenue streams.

Previously, newspapers relied on subscriptions, advertising and classified sections to keep their newsrooms operational. But as information moved online, subscription revenue dried up as people found news for free, and sites like Craigslist and eBay, rather than newspaper classified sections, became the places where people sell their goods.

Between 2008 and 2021, 450 Canadian news outlets closed, according to Pablo Rodriguez, the minister of Canadian heritage. He says this has led to public mistrust and the rise of disinformation. At the moment, the Canadian Broadcasting Corporation is encouraging Canadians to visit its site directly to catch up on the latest news.

Does this impact people in other countries?
Google and Facebook’s restrictions will affect only Canadians when the law goes into effect later this year. This means Americans wanting to read up on news in Canada should still find results from Canadian publications in Search.

Canada isn’t the first country to push a publisher compensation law. The first was Australia, which passed the News Media Bargaining Code in 2021. The law is expected to bring in $130 million annually, and Australia’s Treasury has already called it a success. Both Google and Meta resisted the Australian law before eventually coming to the negotiating table.

The California state legislature also advanced a similar bill last month that would require Big Tech giants to pay for linking to content, with Meta already threatening to pull news content if the law passes. US senators tried to pass a similar law, the Journalism Competition and Preservation Act, last year, but it ultimately failed to make it through Congress. Lawmakers resurrected the legislation last month, however, and hope to bring it to the floor for a vote.

How to find news without Google or Facebook
For Canadians wanting to stay up-to-date on news later this year, here are some ways you can still find news.

World news. The new law affects only Canadian publishers, so searching for news topics in Google will still bring you news from non-Canadian publications.
Bing. Microsoft said it will continue serving up news links for Canadians on its search engine Bing. “Microsoft supports a strong and independent news and media ecosystem as an essential ingredient for social cohesion, and a foundation of our democratic systems of government,” Microsoft said in a statement.
Canadian news sites. You can go to Canadian news sites directly and consider setting a Canadian news website like the CBC or Global News as your default home page on a web browser.
Social media accounts. You can also follow those news outlets on social media platforms like Twitter. Meta says it’s still assessing how the Online News Act will impact news links on its newly launched Twitter competitor, Threads. There are also website aggregation sites like Feedly that can give you a Twitter-like feed of all the news publications you follow.
Get a VPN. It should also be possible for Canadians to use a VPN and set their location to the US or another country. This should allow links from Canadian publishers to appear in search and on Facebook. Be sure to check out CNET’s guidance on the best VPN services before subscribing.
Reddit. For Reddit users, subscribing to the r/Canada subreddit is a good way to find the top stories people are discussing. Cities and provinces like r/Toronto and r/BritishColumbia also have their own dedicated Reddit pages.
Support Canadian journalism. Post.news is a new website that lets you redeem points to read local articles. You can follow publications in the same way you do on Twitter, and it’ll bring you a feed of all the latest stories. Signing up gives you 50 free points and each point costs less than a cent to buy. Even though the cost is minuscule, using a few points to read articles pays websites far more than a banner ad on the side of a webpage.

How has Big Tech affected journalism?
The state of journalism is one of many concerns governments around the world have regarding the power of Big Tech. The industry has largely been unregulated, allowing tech giants to expand rapidly around the globe. Regulators are also noticing the closure of newsrooms and continued layoffs. In the US, 2,500 news outlets have closed since 2005.

As the internet matured, major tech platforms like Google and Facebook took the lion’s share of online traffic, becoming the de facto way people sought out information.

Google, in particular, not only controls the window into the internet for billions of people through Search, Chrome and Android but also the advertising marketplace and associated technology, which has attracted its own US Justice Department-led antitrust lawsuit. This gives Google a huge influence in driving traffic, meaning that for a site to succeed, it needs to optimize its content for Google Search. And as Google has floated more ads to the top of Search, including e-commerce links, that’s had an immediate impact on how much money websites can make.

What will the law do for Canadian journalism?
The Canadian law is estimated to bring in $329 million to Canadian newsrooms. By comparison, Google and Meta brought in $285 billion and $117 billion in revenue last year, respectively. Assuming each company had to pay out $329 million, this would only be 0.11% of Google’s 2022 revenue and 0.28% for Meta.

“Big tech would rather spend money changing their platforms to block news from Canadians instead of paying a small share of the billions they make in advertising dollars,” Rodriguez said in a tweet. “Canadians won’t be bullied. Big Tech isn’t bigger than Canada.”

Google has already shown it’s willing to play the long game, however; Google News backed out of Spain for eight years following the passage of a similar publisher compensation law before coming back last year.

Google didn’t respond to a request for comment. Meta said it had nothing further to add.

Google, Microsoft, OpenAI Join Forces to Create AI Safety Forum

ChatGPT maker OpenAI, startup Anthropic and tech giants Google and Microsoft have forged an alliance to create a framework for safety standards and the responsible development of what they’re calling “frontier AI” models.

The four tech companies on Wednesday announced the formation of the Frontier Model Forum in a blog post and shared the group’s main areas of focus. The announcement comes less than a week after top executives of those four companies, along with others including Meta and Amazon, met with President Biden and pledged to reduce the dangers that unrestrained artificial intelligence may pose and to abide by AI safety measures that prioritize the public’s security and trust.

The Frontier Model Forum has outlined its blueprint for the coming year with three areas of priority. They include determining best practices for developing and launching AI applications, furthering AI safety research, and having transparent discussions about vulnerabilities, risks and security with lawmakers, academic institutions and industry peers.

As part of its larger strategy, the group is open to other organizations joining as members if they meet the criteria regarding frontier models, which they defined as “large-scale machine-learning models that exceed the capabilities currently present in the most advanced existing models.” The founding companies plan to assemble an advisory board in the coming months as well.

“Companies creating AI technology have a responsibility to ensure that it is safe, secure, and remains under human control,” Microsoft’s president and vice chair, Brad Smith, said in a statement. “This initiative is a vital step to bring the tech sector together in advancing AI responsibly and tackling the challenges so that it benefits all of humanity.”

Concerns about AI safety and security risks have prompted calls for multilateral oversight and the establishment of guardrails for consumers and enterprises. Tech firms are being asked to address issues around deepfakes, cybersecurity threats, discrimination and data collection.