I don't know much about the iPhone or its libraries, and there seem to be many layers and lots of different kinds of UI elements, so I may need quite a bit of explaining...
At the moment all I do is keep updating a texture in OpenGL and rendering it to the screen each frame, and I strongly suspect this is not the best way to do it.
Update:
I have tried the OpenGL screen-sized-texture method:
I got 17 fps...
I used a 512x512 texture (because it needs to be a power of two),
and just the call to
glTexSubImage2D(GL_TEXTURE_2D,0,0,0,512,512,GL_RGBA,GL_UNSIGNED_BYTE, baseWindowGUI->GetBuffer());
seems to be responsible for practically ALL of the slowdown.
Commenting it out, while leaving in all my software-rendering GUI code and the rendering of the (now non-updating) texture, gives 60 fps, 30% renderer utilisation, and no noticeable spikes from the CPU.
Note that GetBuffer() simply returns a pointer to the GUI system's software back buffer; the buffer is not re-allocated or resized in any way, it is already correctly sized and formatted for the texture, so I am fairly certain the slowdown has nothing to do with the software renderer, which is the good news: it looks as though, if I can find a way to get the screen updated at 60 fps, software rendering should work fine.
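For reference, the per-frame path being timed looks roughly like this (a condensed sketch, not the exact code: DrawGUIFrame, guiTexture and the quad arrays are stand-ins for my actual setup, which also handles the matrices and presents the renderbuffer):
#import <OpenGLES/ES1/gl.h>
// Minimal sketch of the per-frame path. Assumes an ES 1.1 context is current and
// guiTexture is a 512x512 GL_RGBA texture created once at startup with glTexImage2D;
// pixels is the GUI system's software back buffer.
static void DrawGUIFrame(GLuint guiTexture, const void *pixels)
{
    static const GLfloat vertices[]  = { -1,-1,  1,-1,  -1,1,  1,1 }; // fullscreen quad
    static const GLfloat texCoords[] = {  0,0,   1,0,   0,1,   1,1 }; // whole texture
    glBindTexture(GL_TEXTURE_2D, guiTexture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels); // this is the slow call
    glEnable(GL_TEXTURE_2D);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}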
I tried the update-texture call with 512,320 instead of 512,512, and strangely that was even slower... running at 10 fps, with the Renderer Utilisation showing only around 5% and all the time apparently wasted in a call to Untwiddle32bpp inside OpenGLES.
I can change my software renderer to render natively into any pixel format, if a more direct blit would be faster.
FYI, this was tested on a 2.2.1 iPod Touch G2 (so more or less an iPhone 3G on steroids).
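One thing I have not tried yet, in case the Untwiddle32bpp call really is a format-conversion cost: if the device reports the GL_APPLE_texture_format_BGRA8888 extension, the driver can supposedly accept BGRA bytes directly, something like the untested sketch below (UploadGUIFrameBGRA is just a made-up name, and it assumes I switch the software renderer to write BGRA):
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>
#ifndef GL_BGRA
#define GL_BGRA 0x80E1 // BGRA token, in case this SDK's headers don't define it
#endif
// Untested sketch: same upload as before, but handing the driver BGRA data as-is,
// which might avoid the swizzle. Only valid if glGetString(GL_EXTENSIONS) contains
// "GL_APPLE_texture_format_BGRA8888" and the back buffer is actually BGRA.
static void UploadGUIFrameBGRA(GLuint guiTexture, const void *bgraPixels)
{
    glBindTexture(GL_TEXTURE_2D, guiTexture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                    GL_BGRA, GL_UNSIGNED_BYTE, bgraPixels);
}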
Update 2:
I have just finished writing the CoreAnimation/CoreGraphics method and it looks OK, but I am a little worried about how it updates the screen every frame: it basically throws away the old CGImage and creates a brand new one... have a look at 'someRandomFunction' below:
Is this the fastest way to update the image? Any help would be greatly appreciated.
//
// catestAppDelegate.m
// catest
//
// Created by User on 3/14/10.
// Copyright __MyCompanyName__ 2010. All rights reserved.
//
#import "catestAppDelegate.h"
#import "catestViewController.h"
#import "QuartzCore/QuartzCore.h"
const void* GetBytePointer(void* info)
{
// this is currently only called once
return info; // info is a pointer to the buffer
}
void ReleaseBytePointer(void*info, const void* pointer)
{
// don't care, just using the one static buffer at the moment
}
size_t GetBytesAtPosition(void* info, void* buffer, off_t position, size_t count)
{
// I don't think this ever gets called
memcpy(buffer, ((char*)info) + position, count);
return count;
}
CGDataProviderDirectCallbacks providerCallbacks =
{ 0, GetBytePointer, ReleaseBytePointer, GetBytesAtPosition, 0 };
static CGImageRef cgIm;
static CGDataProviderRef dataProvider;
unsigned char* imageData;
const size_t imageDataSize = 320 * 480 * 4;
NSTimer *animationTimer;
NSTimeInterval animationInterval= 1.0f/60.0f;
@implementation catestAppDelegate
@synthesize window;
@synthesize viewController;
- (void)applicationDidFinishLaunching:(UIApplication *)application {
[window makeKeyAndVisible];
const size_t byteRowSize = 320 * 4;
imageData = malloc(imageDataSize);
for(int i=0;i<imageDataSize/4;i++)
((unsigned int*)imageData)[i] = 0xFFFF00FF; // just set it to some random init color, currently yellow
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
dataProvider =
CGDataProviderCreateDirect(imageData, imageDataSize,
&providerCallbacks); // currently global
cgIm = CGImageCreate
(320, 480,
8, 32, 320*4, colorSpace,
kCGImageAlphaNone | kCGBitmapByteOrder32Little,
dataProvider, 0, false, kCGRenderingIntentDefault); // also global, probably doesn't need to be
self.window.layer.contents = (id)cgIm; // set the UIWindow's CALayer's contents to the image, yay works!
// cgIm and dataProvider are globals: dataProvider has to stay alive because it is reused
// every frame; cgIm is released in someRandomFunction just before the next frame's image replaces it
animationTimer = [NSTimer scheduledTimerWithTimeInterval:animationInterval target:self selector:@selector(someRandomFunction) userInfo:nil repeats:YES];
// set up a timer in the attempt to update the image
}
float col = 0;
-(void)someRandomFunction
{
// update the original buffer
for(int i=0;i<imageDataSize;i++)
imageData[i] = (unsigned char)(int)col;
col+=256.0f/60.0f;
// and currently the only way I know how to apply that buffer update to the screen is to
// create a new image and bind it to the layer...???
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRelease(cgIm); // release last frame's image; the layer keeps its own retain on whatever it is displaying
cgIm = CGImageCreate
(320, 480,
8, 32, 320*4, colorSpace,
kCGImageAlphaNone | kCGBitmapByteOrder32Little,
dataProvider, 0, false, kCGRenderingIntentDefault);
CGColorSpaceRelease(colorSpace);
self.window.layer.contents = (id)cgIm;
// and that currently works, updating the screen, but i don't know how well it runs...
}
- (void)dealloc {
[viewController release];
[window release];
[super dealloc];
}
@end