
How to use 3D of the GPU?

Open discussions on programming specifically for the PS Vita.
Forum rules
Forum rule Nº 15 is strictly enforced in this subforum.
John Dupe
Posts: 79
Joined: Sun Jun 28, 2015 9:24 pm

How to use 3D of the GPU?

Post by John Dupe » Wed Aug 10, 2016 6:26 pm

I am fluent in C++ and have made a simple Pong demo, but I've already done many 2D games on PC and DS. Can anyone tell me anything about registers/address locations for using the GPU's 3D functions?
Stuff I've done:
libvita3d

wonre
Posts: 32
Joined: Sat Mar 03, 2012 8:22 am

Re: How to use 3D of the GPU?

Post by wonre » Fri Aug 12, 2016 1:01 pm

For 2D you must master vita2dlib, and I guess you must compile it yourself.
Here are some notes on how to achieve that if you use MSYS2 under Windows 10:
viewtopic.php?f=116&t=46182

For 3D I will investigate; check there weekly/monthly for any results:
viewtopic.php?f=54&t=46495

MrSilverstone
Posts: 11
Joined: Mon Jun 29, 2015 7:35 pm

Re: How to use 3D of the GPU?

Post by MrSilverstone » Sat Aug 20, 2016 8:42 am

Just look at vita2dlib for the context initialisation. Then use your own shaders and set up matrices for 3D rendering. You can use this for compiling shaders: https://github.com/xyzz/vita-shaders (you will need an ARM version of GCC and an ARM emulator).

John Dupe
Posts: 79
Joined: Sun Jun 28, 2015 9:24 pm

Re: How to use 3D of the GPU?

Post by John Dupe » Sun Aug 21, 2016 7:43 pm

MrSilverstone wrote:Just look at vita2dlib for the context initialisation. Then use your own shaders and set up matrices for 3D rendering. You can use this for compiling shaders: https://github.com/xyzz/vita-shaders (you will need an ARM version of GCC and an ARM emulator).
THANK YOU!!!! So can I just call vita2d_init() and set up shaders and matrices, or should I do the context initialisation myself?
Stuff I've done:
libvita3d

MrSilverstone
Posts: 11
Joined: Mon Jun 29, 2015 7:35 pm

Re: How to use 3D of the GPU?

Post by MrSilverstone » Mon Aug 22, 2016 6:29 am

I looked at the libvita2d sources, and initialising the lib also initialises some shaders the lib itself needs. It should not be a problem, but I think that initialising the context yourself would be the cleaner method.
I tried to create a small 3D renderer: https://github.com/MrSilverstone/vita3d

John Dupe
Posts: 79
Joined: Sun Jun 28, 2015 9:24 pm

Re: How to use 3D of the GPU?

Post by John Dupe » Mon Aug 22, 2016 10:45 pm

MrSilverstone wrote:I looked at the libvita2d sources, and initialising the lib also initialises some shaders the lib itself needs. It should not be a problem, but I think that initialising the context yourself would be the cleaner method.
I tried to create a small 3D renderer: https://github.com/MrSilverstone/vita3d
Cool! My approach is to modify libvita2d to support 3D (it already uses the 3D hardware with an orthographic projection).
It seems vita2d already has a perspective matrix init function, but I can't get it to work.
Stuff I've done:
libvita3d

MrSilverstone
Posts: 11
Joined: Mon Jun 29, 2015 7:35 pm

Re: How to use 3D of the GPU?

Post by MrSilverstone » Wed Aug 24, 2016 4:36 am

How did you set up your view matrix?

John Dupe
Posts: 79
Joined: Sun Jun 28, 2015 9:24 pm

Re: How to use 3D of the GPU?

Post by John Dupe » Wed Aug 24, 2016 5:28 pm

MrSilverstone wrote:How did you set up your view matrix?
That's part of my problem: I couldn't get the lib to work, and I'm still trying to understand 3D rendering as a whole. Aside from that, I'll give GLM a try and use the matrices from that.

EDIT: glm::lookAt() seems to work.
Stuff I've done:
libvita3d


Return to “Programming and Security”