SSB Melee Is The Greatest Game Ever Made
-
2018-03-11 at 4:50 AM UTC
No.
Shenmue 2 will forever be the greatest game of all time. -
2018-03-11 at 4:17 PM UTC
Originally posted by Enter https://niggasin.space/thread/21695
Didn't click -
2018-03-11 at 4:22 PM UTC
Originally posted by Lanny They can't do it, Sakurai's involved now and the tacit admission that the game he turned his back on is the best in the series can't be publicly made.
I like Mashpotato Samurai but I honestly can't wait for him to just stop being the lead voice on Smash (maybe keep him on as a supervisor and consultant).
I remember before Brawl there was an interview with Iwata and maybe him (since they both worked at HAL Labs) about what they would have done if Sakurai refused to sign on for Brawl (apparently Melee was a brutal dev cycle), and he said that Nintendo would probably have taken what they had (Melee) and built on it. That sounds amazing, actually: imagine Melee with the roster of Sm4sh and Brawl.
We'd also probably have a stronger official competitive scene if Sakurai wasn't so anti-competition. -
2018-03-11 at 5:31 PM UTC
Honestly, who the fuck plays Nintendo games? It's so not worth buying a system for the handful of good ones.
-
2018-03-11 at 8:07 PM UTC
Originally posted by §m£ÂgØL2becausescronfuckeduptheforum Honestly, who the fuck plays nintendo games? It's so not worth buying a system for the handful of good ones.
People who like Nintendo games? I bet you have exactly one console from either MS or Sony. When you have a PC, playing games on consoles looks a lot different. You buy a console either for exclusive titles or because it offers some kind of different experience from PC gaming (e.g. non-networked multiplayer, novel peripherals). On both counts Nintendo absolutely destroys the competition: they have a ton of franchises I love that get a release on almost every platform, and they're always trying new weird shit MS/Sony wouldn't dare. And I mean I'm the first to admit I think a lot of Wii/Wii U stuff, and to a lesser degree Switch, is gimmicky, but at least they're trying something new instead of releasing the same console with upgraded hardware specs every couple of years. And Nintendo's stuff, both platforms and games, is technically waaaay more interesting. Reverse-engineering studies of GameCube games, and just the GC design itself, read like a science fiction novel; people did fucking insane graphics wizardry to get that kind of performance out of a .5GHz processor and that GPU, some of it is nothing short of art. -
2018-03-11 at 8:22 PM UTC
The GameCube was the last Nintendo console I liked.
I haven't tried the Wii U or Switch, but from reading about them I can't see anything I'd like.
I don't like change when it comes to video games; I like buttons and triggers and joysticks only, steering wheels being the only exception. -
2018-03-11 at 9:36 PM UTC
Originally posted by Lanny People who like Nintendo games? I bet you have exactly one console from either MS or Sony. When you have a PC, playing games on consoles looks a lot different. You buy a console either for exclusive titles or because it offers some kind of different experience from PC gaming (e.g. non-networked multiplayer, novel peripherals). On both counts Nintendo absolutely destroys the competition: they have a ton of franchises I love that get a release on almost every platform, and they're always trying new weird shit MS/Sony wouldn't dare. And I mean I'm the first to admit I think a lot of Wii/Wii U stuff, and to a lesser degree Switch, is gimmicky, but at least they're trying something new instead of releasing the same console with upgraded hardware specs every couple of years. And Nintendo's stuff, both platforms and games, is technically waaaay more interesting. Reverse-engineering studies of GameCube games, and just the GC design itself, read like a science fiction novel; people did fucking insane graphics wizardry to get that kind of performance out of a .5GHz processor and that GPU, some of it is nothing short of art.
so ur nintendo is like a 4 cylinder honda that u geek out on tuning and meanwhile im driving a big bad v8 over 100mile per hours?
very interesting -
2018-03-12 at 12:07 AM UTC
Originally posted by A College Professor so ur nintendo is like a 4 cylinder honda that u geek out on tuning and meanwhile im driving a big bad v8 over 100mile per hours?
very interesting
Where are you getting this from? The GameCube has a slow processor because it's a 17-year-old console. Nintendo has been fairly competitive with Sony/MS in terms of hardware in each generation, although cross-architecture comparisons are always difficult.
In any case, PCs have almost always had better cost/performance characteristics (there are exceptions to this but it doesn't last long since consoles are pegged to a collection of hardware in a way modular PCs are not) so if you want some kind of "big bad v8" thing consoles are automatically out.
GC was technically interesting because its architecture allowed for a different, more open-ended bag of graphics tricks than other architectures. You can do magic with the traditional main/video split-RAM setup, but with the unified SRAM setup the game is totally different. It's not that the base specs are poor, it's that it has a much higher ceiling in terms of what you could buy with big tech investments. Compare Wind Waker or Twilight Princess with anything that came out on the PS2 graphically. -
2018-03-12 at 3:21 AM UTC
Originally posted by Lanny Where are you getting this from? The GameCube has a slow processor because it's a 17-year-old console. Nintendo has been fairly competitive with Sony/MS in terms of hardware in each generation, although cross-architecture comparisons are always difficult.
In any case, PCs have almost always had better cost/performance characteristics (there are exceptions to this but it doesn't last long, since consoles are pegged to a collection of hardware in a way modular PCs are not), so if you want some kind of "big bad v8" thing consoles are automatically out.
GC was technically interesting because its architecture allowed for a different, more open-ended bag of graphics tricks than other architectures. You can do magic with the traditional main/video split-RAM setup, but with the unified SRAM setup the game is totally different. It's not that the base specs are poor, it's that it has a much higher ceiling in terms of what you could buy with big tech investments. Compare Wind Waker or Twilight Princess with anything that came out on the PS2 graphically.
Could you go into detail about what makes the GameCube such a fantastic piece of hardware? I always find those kinds of breakdowns very interesting. I've always heard that the GameCube was a great piece of hardware because its architecture was so good.
Games like Metroid Prime, Star Wars Rogue Squadron, or F-Zero GX just blow anything on the PS2 away graphically. -
2018-03-12 at 3:28 AM UTC
Apparently it's so good because it's pretty underpowered hardware, and he likes it because some devs did a lot of work to squeeze good performance out of a turd
-
2018-03-12 at 8:43 AM UTC
Originally posted by Jeremus Could you go into detail about what makes the GameCube such a fantastic piece of hardware? I always find those kinds of breakdowns very interesting. I've always heard that the GameCube was a piece of hardware because it's architecture was so good.
The most interesting thing to me is using the same memory for the GPU and CPU. Like one of the first things you learn about modern GPUs when you start into graphics programming is that GPU memory is across a bus, that it's expensive to communicate with, and consequently you do it infrequently and in small volume. That's a big part of loading screens: you take all those textures and static geometry from main memory (or disk) and schlep it over to the GPU and don't touch it again. If you need to change it, you try to figure out a way to do it on the GPU, since the only thing you can afford to send over the bus is model data (like "what is the state of the visual world" in abstract terms, things like "where is the camera", "which frame of animation is X in", "is Y a billboard or should I render it for real"), not actual "how should this thing look" data. The idea that you can change "VRAM" mid-frame is weird, you usually never have cause to do it, but you see some nutty water effects that use it because you can use "true" random values instead of the very limited "shuffle-random" available to the CPU. And you can rewrite any texture dynamically as long as the change fits in a frame's budget (processor speed divided by frame rate, in cycles), so things like, whatever, bullet holes or blood decals or what have you aren't actually "decals" at all; they're dynamically calculated in each instance and effectively free, instead of a deformed sticker you apply over other textures at some cost to render speed. -
2018-03-12 at 8:45 AM UTC
Originally posted by Lanny The most interesting thing to me is using the same memory for the GPU and CPU. Like one of the first things you learn about modern GPUs when you start into graphics programming is that GPU memory is across a bus, that it's expensive to communicate with, and consequently you do it infrequently and in small volume. That's a big part of loading screens: you take all those textures and static geometry from main memory (or disk) and schlep it over to the GPU and don't touch it again. If you need to change it, you try to figure out a way to do it on the GPU, since the only thing you can afford to send over the bus is model data (like "what is the state of the visual world" in abstract terms, things like "where is the camera", "which frame of animation is X in", "is Y a billboard or should I render it for real"), not actual "how should this thing look" data. The idea that you can change "VRAM" mid-frame is weird, you usually never have cause to do it, but you see some nutty water effects that use it because you can use "true" random values instead of the very limited "shuffle-random" available to the CPU. And you can rewrite any texture dynamically as long as the change fits in a frame's budget (processor speed divided by frame rate, in cycles), so things like, whatever, bullet holes or blood decals or what have you aren't actually "decals" at all; they're dynamically calculated in each instance and effectively free, instead of a deformed sticker you apply over other textures at some cost to render speed.
and so X and Y are variables?