Everything posted by The Bard
-
You're right, Ubisoft likely won't care, which is why I think the "mass boycott" aspect of this discussion is ridiculous to begin with, and also why I wasn't really addressing it, but thinking more along the lines of whether this is a good precedent for consumers. As for it being a small difference, did you, for example, not notice the difference between the PS2 and GC versions of RE4? It's literally worse than that. No offence meant to Mr-Paul either, it's more a general frustration that I find little is really talked about here outside the business aspect. Anyway, is anyone actually excited for this or Far Cry? I feel like I've seen so little that makes a difference in these games other than the application of Cliff Bleszinski's much overused catchphrase.
-
The whole point of this thread is that it wouldn't take extra work, since this is a case of enforced parity, not natural parity. Look to every other instance of multi-platform release this generation and you'll see that with both versions released on the same date, the PS4 version always performs at better frame rates and at higher resolutions. Ubisoft are doing this just because Microsoft are somehow incentivising it or coercing them, not because the design team figure "hey, let's hamstring this game we've spent the last three years making." Also, my argument is that I don't actually care if it takes extra work. Maybe it does - but if everyone else can do it, so can Ubisoft. If the XBone runs it at 900p, I guarantee you that the PS4 can run it at 1080p (with a minimum of extra work, since the PS4's GPU is mathematically 50% more powerful). You're saying that it must be "easier" for them as if that amounts to any sort of argument other than that we should let their shit slide, because laziness is some sort of virtue. This is the problem with this forum; almost everyone looks at games from the perspective of sales, or the business angle, as if that's the sole lens through which you can understand games.
-
@Jonnas, it's not a battle because Ubisoft are going to do whatever they want. It's a case of consumers choosing not to spend their money on damaged goods. Also, as for 900p not being a third less than 1080p? Do the maths: 1920x1080 = 2,073,600 pixels, while 1600x900 = 1,440,000 pixels. That's a difference of 633,600 pixels, and 633,600/2,073,600 = 0.305 - very slightly less than a third.
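The arithmetic above can be checked in a couple of lines of Python (a minimal sketch; the resolution figures are the ones quoted in the post):

```python
# Pixel counts for the two resolutions (width x height).
full_hd = 1920 * 1080   # 1080p: 2,073,600 pixels
hd_900 = 1600 * 900     # 900p:  1,440,000 pixels

# Difference and its share of the 1080p total.
diff = full_hd - hd_900        # 633,600 fewer pixels at 900p
fraction = diff / full_hd      # about 0.306 - just under a third

print(diff, round(fraction, 3))  # prints: 633600 0.306
```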
-
I already have a PC, which is where I'll probably be buying this if I buy it at all. Expecting a company not to play politics with shitty proprietors isn't me being entitled dude, it's me distinguishing between "is" and "ought." It boggles the mind how quickly people jump to the defence of large companies that make recycled, low innovation "triple a" software when anyone expects anything of them. The Windows 3.1 comment was me pointing out how useless a discussion about Ubisoft's "entitlements" is. Edit: It's the same problem with PC by the way. If you have an AMD GPU, ACIV was woefully unoptimised for it compared to Nvidia. I'm pretty much done with Ubi's same old output anyway.
-
Right, but then again, Ubisoft are totally entitled to do whatever they want, including making AC Unity a 16-bit Windows 3.1 exclusive that runs in 320x240. I don't give a shit about their entitlements. And I'm not calling for a mass boycott - anyone who does so for a videogame (short of the Orson Scott Card Shadow Complex shit) is self-evidently the most futile person on the planet. The thing is, this is Assassin's Creed. I've played these fucking games to death, and there is almost nothing that I expect the new one to offer me, with the exception of a good looking world I can prance around in. I do - like many people - play a lot of games because of the artwork, and the trouble these people go to in creating these worlds. Maybe it's something that comes out of having spent years of my life in school analysing the minutest brush strokes in art class, but this stuff does make a difference to me. If I'm playing these games for little more than the environment, then why would I want a shitty version of that?
-
What difference does that make? Even if they won't notice, we - a sizeable portion of the gaming population - will. My whole problem with the posts above is that we're being expected to ignore our ability to notice these details just because the majority wouldn't. Well so what if they wouldn't? That's no reason why the people who do notice it shouldn't demand it, especially since it takes almost zero effort to implement. I spend my limited time on entertainment software, so why should I waste it on whatever designed-by-committee bullshit is trotted out for the lowest common denominator, and why should I cede to arguments that that's the best I should expect?
-
Why does it matter that "most customers" wouldn't care? Are "most" customers the guys that spend countless hours of their lives on message boards discussing this shit? Why do you expect us to care about "most consumers"? The difference between 900p and 1080p is about a third of the pixel count, which is very, very noticeable, not only because of the detail lost, but also because of the increased aliasing. So when you say the difference between the consoles is negligible, you'd be right in that I'm sure any game that can run on a PS4 could also run on an XBone, but to actively ask people to be less discerning when their console choice might have been directly influenced by its comparative graphical ability is dumb. In hardware terms, the PS4's GPU is about 50% more powerful than the XBone's. A publisher masking the disparity between the two to avoid whatever discussion might occur, or to placate the proprietor of the weaker console, is just shoddy, and adds to the impression that Microsoft have little to add besides trying to brute force their way through this gen using their bank balance as a battering ram.
-
Yeah...at this point my PS4 is a video streaming device. Haven't played anything on it since Last of Us.
-
Substitute the word cable for converter box. I could swear there were some integrated converter cables - although a look on Amazon tells me they're fake as balls. Oops.
-
Optical converters are super fucking expensive as well. You could get an HDMI to Component cable instead though, would be a lot cheaper than a new monitor.
-
I'm not sure what I want to play right now between this and Shadow of Mordor. The latter seems like a pastiche of play styles borrowed from recent games, but this seems actually unique. Has anyone here played it? Also Creative Assembly can go back to making Total War games. Need a sequel to Empire right the fuck now.
-
Lol Kamiya seems like such a bro
-
Oh fuck I'm getting a bit of a boner
-
I went to see Boyhood and now I can never see it for the first time again.
-
Wait wait, I thought that whole name thing with @Fierce_LiNk was obviously an elaborate joke because that was pretty much the only way you could make a name that has Jim on one end and Babooooooor on the other even more ridiculous?
-
Just seems like the most boneheaded, clueless move imaginable.
-
Well it's more a protection against losing a chunk of that money through fried components; the body can hold a greater charge than any component in a computer is capable of handling - one static discharge in the wrong place and there goes your motherboard.
-
Right, in a game where you're dealing with the predictable movements of a horde of enemy AI, as opposed to needing the fluid motion and faster controller polling rate of a 60fps shooter to track the erratic movements of a player antagonist? (PvP notwithstanding - because the vs multiplayer in this game is truly poor anyway). It's really not that important to have a Call of Duty polling rate and framerate for this game, especially since it would come at the expense of visual grandeur in its already very aliased and sparsely detailed world. The framerate angle is just a poor one to attack this game from. Also, re. opinions: don't use that crutch - if we bring up epistemology, everything is an opinion for which we have varying degrees of evidence and ability to track the truth. If we look at game design, it relies on the ability of the designer to understand and manage the player's needs and expectations: it's a very psychological art. And part of that means you need to understand, on aggregate, what level of fluidity the greatest breadth of people likely to play your game will be comfortable with. Evidently, in the case of cinematic FPS games, 30 frames has become a standard for a reason - people are willing to trade fluidity for visual beauty, first because you don't need the additional responsiveness to deal with non-player antagonists, and second because the added sluggishness adds to the look of it. They made the 30fps choice because it's what most people want, and it's become a standard for a reason. If you don't like that then you're going to be consigned to either playing games that are conceived as mostly mechanical, or to building a gaming PC.
-
I'd agree that 60fps would be better if Destiny were a twitch heavy, reflex based shooter where aim is as important as managing abilities, keeping track of power cooldowns and controlling large crowds of enemies for whom a centre-mass aim is a better bet than aiming for the head. But it's not that type of game, so it really doesn't matter much. That, coupled with the fact that they are going for a cinematic experience, which 30fps lends itself to a lot better than 60 (compare any film shot at 24 frames with something like Public Enemies, for example. The latter looks like a YouTube video, whereas the motion in most films is given a weighty inertia by way of the reduction in frames), leaves 30fps as a good choice for this game. Yes, I think there are differences between films and games, and some of this argument is lost from a first person perspective, especially in a game that's about play rather than narrative and animation, but that's another conversation. To be honest though, I just don't think it's a good game, so this is a pointless conversation about some boring frame stat that's being unnecessarily fetishised.
-
I've got the 290. It's about a hundred pounds cheaper and has a benchmark difference of about one or two frames from the 290X, but it's still not worth the upgrade from a 280X. That card'll be fine for 1080p gaming for a couple of years, and you can upgrade later on seeing as the rest of your components will be good for a long while.