- If you’re using PHP <7, you can still use random_bytes()! Well, once you install random_compat via Composer.
- If you’re using any mcrypt function other than mcrypt_create_iv() for working with passwords, stop. Sounds like you’re encrypting passwords rather than hashing them. Don’t do that.
- mcrypt isn’t exactly 1:1 with OpenSSL for encrypt/decrypt, but you can get it to do the same thing (same inputs, same outputs). I’ll do a post later on how to do this…I’ve done exactly that before and have unit tests to prove it 🙂
- You can specify a manual salt in PHP 7.0 within password_hash(). It’s deprecated, and for good reason: don’t use it. But if you need backward compatibility with something or other, it’s there…for now.
- The next password hashing mechanism that’ll show up in PHP, as PASSWORD_DEFAULT for password_hash(), is Argon2i. There’s an RFC for inclusion of the algorithm in the language, but nothing quite yet for setting as PASSWORD_DEFAULT…probably only a matter of time though. Maybe PHP 8.
- Re: MFA, TOTP doesn’t have replay prevention built in, but it cycles every 30 seconds if you’re using Google Authenticator (the spec lets you use other periods, but 30 seconds is the only one GAuth supports). HOTP is the other OATH-standardized one-time password, and that one is strictly event-based. Look up spomky-labs/otphp for building those codes, as well as bacon/bacon-qr-code to spit them out as QR codes that Google Authenticator can consume. I built an internal 2FA server using those libs…eventually it’ll get open-sourced…
- For JWTs, League’s OAuth2 server uses lcobucci/jwt rather than Firebase’s library…lcobucci’s a bit more full-featured.
- Read this about JWTs’ algorithm field, and how you shouldn’t trust it. Then use a library that doesn’t have that vulnerability (both of the above are patched).
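To make the first couple of bullets concrete, here’s a minimal sketch of the hash-don’t-encrypt advice (the passwords and byte counts are just illustrative; on PHP 5.x you’d `composer require paragonie/random_compat` first):

```php
<?php
// On PHP 7+, random_bytes() is built in; on PHP 5.x, random_compat
// polyfills it. Uncomment the autoloader when using the polyfill:
// require 'vendor/autoload.php';

// 16 bytes of CSPRNG output, e.g. for a reset token
$token = bin2hex(random_bytes(16));

// Hash passwords -- never encrypt them. password_hash() generates its
// own salt; don't pass the deprecated 'salt' option.
$hash = password_hash('correct horse battery staple', PASSWORD_DEFAULT);

var_dump(password_verify('correct horse battery staple', $hash)); // bool(true)
var_dump(password_verify('wrong password', $hash));               // bool(false)
```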
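For the curious, the HOTP/TOTP mechanics mentioned above are simple enough to sketch in plain PHP (PHP 7 syntax; this is purely to show how the codes are built — use spomky-labs/otphp in production rather than rolling your own):

```php
<?php
// HOTP per RFC 4226; TOTP (RFC 6238) is just HOTP with
// counter = floor(time() / 30), the 30-second window GAuth expects.
function hotp(string $secret, int $counter, int $digits = 6): string {
    $bin  = pack('N*', 0) . pack('N*', $counter);   // 8-byte big-endian counter
    $hash = hash_hmac('sha1', $bin, $secret, true); // 20 raw bytes
    $offset = ord($hash[19]) & 0x0f;                // dynamic truncation
    $code = ((ord($hash[$offset]) & 0x7f) << 24)
          | (ord($hash[$offset + 1]) << 16)
          | (ord($hash[$offset + 2]) << 8)
          |  ord($hash[$offset + 3]);
    return str_pad((string)($code % (10 ** $digits)), $digits, '0', STR_PAD_LEFT);
}

function totp(string $secret, int $period = 30): string {
    return hotp($secret, intdiv(time(), $period));
}

// RFC 4226 Appendix D test vector: counter 0 => "755224"
echo hotp('12345678901234567890', 0), "\n"; // 755224
```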
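On the JWT point: the vulnerability boils down to trusting an attacker-controlled header. A sketch of the defensive pattern — function names here are mine, not from either library — is to pin the algorithm you expect before any signature check:

```php
<?php
// A naive verifier that honors "alg" from the token itself will accept
// alg:"none" (no signature at all) or can be tricked into verifying
// HS256 with an RSA public key as the HMAC secret. Pin it instead.
function decodeHeader(string $jwt): array {
    $headerB64 = explode('.', $jwt)[0];
    return json_decode(base64_decode(strtr($headerB64, '-_', '+/')), true);
}

function assertExpectedAlg(string $jwt, string $expected = 'RS256') {
    $header = decodeHeader($jwt);
    if (!isset($header['alg']) || $header['alg'] !== $expected) {
        throw new UnexpectedValueException('Unexpected JWT alg');
    }
}

// A forged "alg":"none" token gets rejected before any signature logic runs:
$forged = rtrim(strtr(base64_encode('{"alg":"none","typ":"JWT"}'), '+/', '-_'), '=') . '.e30.';
try {
    assertExpectedAlg($forged);
} catch (UnexpectedValueException $e) {
    echo "rejected\n";
}
```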
As I’m writing this, I’m sitting at SeaTac waiting for my flight back from PNWPHP to board. One talk there inspired me to get AMP up and running on this blog…but more on that in another post. As part of that process, I figured, “What the heck, I’ve got CloudFlare set up on my site, which gives me HTTPS for free. I should force HTTPS for my entire (WordPress) blog. Which means I’ll get HTTP/2 acceleration for free as well (because CloudFlare does that), which Davey Shafik said was pretty awesome.”
My site has been available over HTTPS for a bit, as I set up CloudFlare a few months back, but the default protocol was HTTP, hitting my host directly. No geo-acceleration, no HTTPS, no HTTP/2.
The process to fix that issue took just a few steps.
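For a WordPress site behind CloudFlare, the gist usually involves telling WordPress that the original request was HTTPS, so it doesn’t get stuck in a redirect loop behind the proxy. A sketch of the wp-config.php side (these lines are illustrative, not necessarily the exact steps I took):

```php
<?php
// wp-config.php additions (sketch). CloudFlare terminates TLS and
// forwards the original scheme in X-Forwarded-Proto, so mark the
// request as HTTPS when that header says so.
if (isset($_SERVER['HTTP_X_FORWARDED_PROTO'])
    && $_SERVER['HTTP_X_FORWARDED_PROTO'] === 'https') {
    $_SERVER['HTTPS'] = 'on';
}

// Force the dashboard and login over SSL.
define('FORCE_SSL_ADMIN', true);
```

The front-end redirect itself (HTTP to HTTPS) can then live in a CloudFlare page rule or an .htaccess rewrite, and the WordPress siteurl/home options get switched to https://.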
In this post, a polyglot dev (PHP included) says we need an alternative to PHP. By which he means a widely used replacement for PHP for web applications and web sites. The idea being that, if you get a simpler, more consistent, secure web app/site language that has built-in support for fun new technologies and techniques like HTTP/2, WebSockets, unikernels and concurrency/async primitives, they will come. “They” in this case being developers who wouldn’t normally know how to write good code in PHP and will magically do so in a language that makes it easier to do so.
The post concludes with no clear recommendation of an existing language, nor even “X language plus Y features”. And, perhaps more importantly, the post doesn’t tackle how that language would rise to fame; remember that we’re talking about an alternative that can take PHP’s place as the lingua franca of server-side programming in a web application context, ostensibly by providing killer apps for both new development and cross-language refactors. I’ll come back to this omission in a minute. First, let’s go over the stated objections to PHP as it stands.
Last weekend I attended SunshinePHP (it was a blast; you should go next year if you didn’t this year…or if you did, for that matter). Friday night, there was a panel on minimum PHP versions, with an eye to raising the bar to something in recent, non-end-of-life history rather than allowing versions that won’t get security fixes anymore. The battle cry there was one of pushing hosts, devs, sysadmins and communities in general to newer versions (5.5, 5.6, and 7 late this year) in the name of better speed, better security, and a much happier environment for developers.
This battle cry was mixed with explanations from some panel members of why their packages still support PHP 5.2 and 5.3 (remember, neither gets security fixes anymore), along with remonstrances that raising the minimum version requirement to something reasonable on CMS-centric frameworks like CodeIgniter, or on CMSes themselves like WordPress and PyroCMS, would end up stranding user bases on unsupported, vulnerable software rather than getting those devs and end users onto a supported, more dev-friendly version of the runtime. For full-stack frameworks, and given the proliferation of, and ease of migration to, 5.4+ hosts, I find this unconscionable, for reasons stated eloquently by Anthony Ferrara.
But another member of the panel also supports PHP 5.3 with his libraries: Paul M. Jones with the AuraPHP project. Why am I not railing against this…and the fact that the Aura v2 libraries actually downgraded their version requirements relative to Aura v1? Paul mentioned that the effort to allow 5.3 compatibility was quite low (remove short array syntax, remove callable typehints), but there’s a better reason: Aura libs can be used to modernize applications and serve as a bridge to current versions…and you want to put the other end of the bridge where those apps are sitting right now.
According to this article, Microsoft has switched feet on its foot-shooting escapade that is the XBox One. The short version of the story: Microsoft decided to roll back truly heinous DRM on its games, but in return users are giving up features that make the console a generation ahead of the PS4. Or something like that. Yeah…no.
The reason: if you want to lace a disc with heavy DRM that enables you to use the game in a not-disc way, you’re doing it wrong. As long as folks own physical media (and, like it or not, they own the metal and plastic wafer that the game is printed onto), they’ll have in their mind the concept of ownership. Right of first sale and all that. Which is why Sony’s 22-second “how to share a PS4 game” struck such a chord with folks.
Now let’s look at the downloadable game side of things. The expectation of playability anywhere is there, but is tempered by an expectation of DRM. Anyone who has downloaded a PC game from the likes of Amazon or EA Origin has seen this; you can pull the game however many times, but don’t expect to play it simultaneously on two different machines. Just like with a disc in a drive, it won’t work. Which makes sense…you’ve got to protect those bits somehow.
This raises the question: should a game be available in both disc and download formats, each with its own DRM scheme? My answer: absolutely. Build what the customer expects into the disc, and what you think the customer might want into the download. The author mentions that he has a fast, reliable ‘net connection. That’s great; that means you can buy a downloadable game and skip the disc once and for all.
The point of physical media (which can pack 20+ GB of content onto a single Blu-ray disc) these days is to provide a fast-loading alternative given the slow average connection speed of Internet users at large. For them, downloading an entire game is an ordeal, particularly if their connection is capped, throttled or slow all the time (not all of us have Google Fiber, FiOS or even Comcast available). And their friends may be in the same boat, so schlepping a disc from point A to point B isn’t a big deal, but dealing with on-disc DRM is. You don’t want another SimCity, do you?
tl;dr: Customers have spoken, and Microsoft did the right thing by rolling back its physical disc DRM. If you want more features at the expense of DRM, there is a solution: downloadable games (which should be doable with every single game). Locking down physical media isn’t.
My most recent tech purchase over $500 was a computer. Specifically an HP Envy x2. One of the reasons: amazing battery life. Twelve hours or so. The catch: the darned thing pokes along due to an Atom Z2760 CPU. But it’s also $580, so that’s forgivable.
My workhorse notebook is an early 2009 MacBook…with a few upgrades. It’s got the 2GHz Core 2 Duo CPU and nVidia 9400M graphics…backed up with 8GB of RAM and a 256GB Crucial m4 SSD. It’s not the speediest machine out there, and I can’t seem to find a decent replacement battery so I can only get three hours or so away from an outlet, but with the RAM and disk upgrades it’s actually reasonably fun to use.
Why did I just bring up two pieces of old/low-end equipment that have nothing to do with the current MacBook Airs announced a couple hours ago, other than screen size? Because replacing both with a 13-inch Air isn’t out of the question for me…later this year, once the newest OS X edition comes out. That said, there are a few specs that got glossed over during the presentation today, amid all the talk about power efficiency (nine hours on a charge for an 11-inch machine, or twelve hours on a 13-inch, is just excellent). Stuff like CPU speed and upgrade costs.
This is going to be a bit of a rapid-fire, non-exhaustive list, but…
- Having an IDE other than Eclipse for Android dev makes me want to pick up the platform again. JetBrains, the maker of IntelliJ IDEA (on which the new Android IDE is based), is a solid outfit (I use one of their other IDEs relatively regularly).
- I’m not buying a Galaxy S4 “Nexus Edition”. My S III is just fine, and the S4, in addition to being expensive, has the same problem that the Nexus 4 has: I can’t get 4G where I need it because Sprint is the only carrier that can do that.
- I should have gone to I/O. I wouldn’t pay full rack rate for the S4 Developer Edition or the Chromebook Pixel (though I’ve thought about the latter), but I would certainly use the heck out of said devices if they were included in the price of admission.
- Watch out, PayPal. Google isn’t the first to do person-to-person money transfers, but if you’ve got a Google Play account and Google has opened up the new “attach money” feature to you, the amount of effort required to send money to someone else is ridiculously low.
- The new Hangouts isn’t the first time Google has done photo sharing through chat (and the makers of Hello did a really good job with the app, speaking from personal experience). It’s been a while, though.
- Speaking of Hangouts, the fact that the service has been pushed in the direction of a persistent chat room with video calling et al as a situational add-on is…well…the way it should be.
- Per-minute billing (with either one-hour or ten-minute minimums) on Google’s IaaS compute offering is really cool. Nice to see Amazon one-upped at their own game, at least in this small way, and I’m sure that this will make sites that see serious traffic spikes for smallish periods take note of Google’s offering. Until its competitors implement the same thing, of course.
- The new Maps looks epic. If only I could actually use it.
- I want an H.264 (AVCHD) -> VP9 encoder (CLI is fine…integration into HandBrake is a nice bonus) yesterday. Or a whatever -> VP9 encoder, for that matter. I also want to know how VP9 compares to H.265 (is it inferior, like VP8 is compared to H.264, or is it pretty comparable?).
- I, for one, welcome our new voice search enabled, auto-image-enhancing, auto-hash-tagging overlords. The competition is a click away, but they just aren’t up to snuff compared to Google in so many of these areas.