Most notably, it would only ask that companies and individuals make "reasonable efforts" to offer technical help when law enforcement wants access to encrypted data. That could be an out for companies designing encryption that they can't break themselves. The proposals would also limit the scope of demands to whoever is responsible for controlling the encryption process (providers would be off the hook), and would no longer require access in cases of spying, terrorism, or critical infrastructure like electrical grids.
This still leaves significant problems, though. As Just Security asks, how do you define who controls the encryption process? Does this exempt certain companies, or put them on the hook the moment law enforcement thrusts a device into their hands? The carve-outs for spying and terrorism may also be less a privacy-minded gesture than a matter of ceding jurisdiction to other Congressional committees.
Moreover, the infrastructure limitations may only serve to highlight the problems with mandating backdoors and other guarantees of access to encrypted systems. If you're worried that weakening encryption would put power plants at risk, why is it okay to put the broader public at risk? And wouldn't weakening security for one hurt the other?

There's no known timetable for when a revised bill would show up (after the election seems more likely), and there's no guarantee that it'll keep these exact changes. However, it's clear that Senators Dianne Feinstein and Richard Burr still don't have a firm grasp on how encryption works. It's virtually impossible to create security holes that only law enforcement can climb through, and a toned-down proposal won't change that.