I'm sorry, but I simply could not resist.
It's just... Dramatic...
This is a very quick post about modern PR practices and how every single medium-to-large video game and entertainment media website can never, ever be trusted. Ever.
Disclaimer: I don't play sports games, soccer games in particular. I just don't care about football. This post is exclusively about the dirty PR practices employed by commercial companies and the PR agencies they work with.
On September 25th, EA Games released a new title in its well-known sports series, "FIFA 19". I won't go into detail about what a... Controversial product it ended up being. If you're interested, please refer to the following videos by YongYea: one, two, three.
I'm here to talk about how the game scored on one of the better-known review sites out there — Metacritic.
Here's the FIFA 19 for PS4 page at Metacritic in its current state (October 5th):
Note the sharp disparity between "Critic" and "User" scores for the product.
EA is a big Commercial Entity, and its CEO knows the company needs to maintain good sales of its products. The most common way to do that on review websites and social networks has always been... paid reviews. This doesn't come as a shock to anyone: everyone knows that at least 15-30% of all reviews on the Internet should be taken with a grain of salt, so to speak.
As a web project manager working for a public commercial company, I have some understanding of how paid reviews are approached: how new virtual users are created, and how to make the best use of those accounts as well as existing ones.
And I can definitely tell you this is not how you do it.
NVIDIA and its partners, as well as AAA-developers and game engine gurus like Epic Games, keep throwing their impressive demos at us at an accelerating rate.
These feature the recently announced real-time ray-tracing tool-set of Microsoft DirectX 12, as well as the (claimed) performance benefits of NVIDIA's proprietary RTX technology available in its Volta GPU lineup. In theory, this should give developers new tools for achieving never-before-seen realism in games and real-time visual applications.
There's a demo by the Epic team I found particularly impressive:
Looking at these beautiful images, one might expect NVIDIA RTX and DirectX DXR to do more than they are actually capable of. Some might even think the time has come when we can ray-trace a whole scene in real time and say goodbye to good old rasterization.
There's an excellent article over at PC Perspective you should definitely check out if you're interested in the current state of the technology and in the relationship between Microsoft DirectX Raytracing and NVIDIA RTX. Without any explanation, that relationship can be quite confusing: NVIDIA heavily promotes RTX as natively hardware-accelerated tech, while Microsoft stresses that DirectX DXR is an extension of the existing DX12 tool-set, compatible with all future certified DX12-capable graphics cards (since the world of computer graphics doesn't revolve solely around NVIDIA and its products, you know).
So here I am to quickly summarize what RTX and DXR are really capable of at the time of writing, and what they are good (and not so good) for.
Whenever I hear the term "real-time ray tracing", I immediately think of some of the earlier RTRT experiments done by Ray Tracey and OTOY's Brigade. You know, those impressive yet noisy and not-quite-real-time demos with mirror-only reflections and lots and lots of noisy rendered frames blended together.
Like this one:
I wouldn't have dreamed of seeing anything actually ray-traced in real time at 30 fps, without any noise, in the next 5-10 years. Little did I know, NVIDIA and Microsoft had the same idea and put their best minds to the task.
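As an aside, the reason those early demos blended so many noisy frames is simple statistics: each frame is a noisy but unbiased Monte Carlo estimate of the true pixel value, and averaging N independent frames shrinks the noise roughly by a factor of the square root of N. Here's a toy sketch of that effect (a hypothetical single-pixel model, not anyone's actual renderer):

```python
import random
import statistics

# Toy model: the "true" radiance of one pixel (hypothetical value).
TRUE_PIXEL = 0.5

def noisy_frame():
    # One 1-sample-per-pixel render: unbiased, but with heavy Gaussian noise.
    return TRUE_PIXEL + random.gauss(0.0, 0.2)

def accumulate(n_frames):
    # Blend n successive noisy frames into one smoother estimate.
    return sum(noisy_frame() for _ in range(n_frames)) / n_frames

random.seed(42)
# Measure the remaining noise (std dev) of the blended pixel over many trials:
# it drops roughly as 1/sqrt(n_frames).
for n in (1, 4, 16, 64):
    estimates = [accumulate(n) for _ in range(2000)]
    print(f"{n:2d} frames -> noise {statistics.pstdev(estimates):.3f}")
```

Run it and you'll see the measured noise fall off as more frames are accumulated, which is exactly why those "almost real-time" demos looked smeary in motion: the accumulation buffer resets whenever the camera moves.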
This demo, developed by EA (believe it or not), is running on NVIDIA's newest lineup of Volta GPUs, which means Volta is also on the way! Yay! NVIDIA's RTX tech sure looks promising.
Can you imagine what will happen to offline CUDA ray tracers after this announcement? Hopefully their devs will be able to make this amazing tech part of their rendering pipelines ASAP. Otherwise, c'est la vie: you've been REKT by a real-time ray-tracing solution.
Just kidding. We've got to test this thing out first, and only then will we be able to tell whether we've been led to believe in yet another fairy tale, or whether you need something like eight Voltas to run this demo, which would be a letdown.
I'm starting to question my life choices...
I mean, I've managed all kinds of projects, produced countless videos, published a small game, and decided to work on an animated feature film. All for one thing: to make some noise, get noticed, and maybe even make a buck or two along the way. You know, the basics.
Alas, trapped within the confines of my inflexible mind and obsolete worldview, I would never be able to come up with something as beautiful and inspiring as this:
Please, take a deep breath. Pause. Then repeat this once again, proud and aloud, and let it sink in:
SUCK YOUR WAY TO PANTSU PARADISE!
This is exactly what you mean when you declare:
This is impressive. You need a special mindset to come up with something as bizarre as this franchise, bring these games to fruition as real commercial projects, and actually sell them.
But not just sell them. No-no-no! Sell them with a DLC. Yeah, just an innocent tiny $90 DLC. Which, naturally, gives the lucky player the power to zoom in on and undress the characters.
Am I going to play this... game? Doubt it. Am I impressed by the mere fact that such a project exists and sells more or less well? Hell yeah!
Basic instincts, man... That's where it's at. Exploit and prosper. Seems like PQUBE LTD nailed it (he-he, get it?) when it comes to providing a quality product for their target audience. If it sells, it sells. Simple as that.
Screw the film! Forbes 100, here I come!
This year at GDC, Vicon will be demonstrating its new Shōgun 1.2 MoCap platform. At its core, it's an optical MoCap solution (quite different from inertial ones like Perception Neuron), the kind commonly used in professional production.
Vicon's Shōgun and Shōgun Live are well-known MoCap solutions used by Hollywood filmmakers and AAA game devs worldwide.
This time the company comes to GDC with a treat: it will be coupling Shōgun with VR headsets to immerse GDC visitors in interactive virtual worlds, in what it's calling a "VR multiplayer escape room".
What's even cooler is that, as far as I can tell, all MoCap data will be streamed directly into Unreal Engine 4. I believe this will make the engine even more popular among game devs, especially those interested in VR applications (sorry, Unity).
Anyway, if this is something you might be interested in, you can read more about the event at Vicon's official website.
The new TMNT makes me so happy, yay! Splinter is my favorite. Who's yours?