Volume Number: 14 (1998)
Issue Number: 5
Column Tag: Multimedia

Lights, Camera, Action...

by Tim Monroe, Apple Computer, Inc.

Embedding QuickDraw 3D objects into QuickTime VR panoramas

Introduction

QuickTime VR is a wonderful medium for immersing the user in photorealistic or rendered virtual environments, but it doesn't take too terribly long before the static and silent nature of the experience becomes apparent. By "static" I mean that, for the most part, there's no motion, no life, in the virtual environment. Understandably, one of the most common requests that the QuickTime VR team has gotten from developers is for a way to embed sounds and moving objects into QuickTime VR scenes. Although the virtual environments provided by QuickTime VR are quite compelling all by themselves, they just spring to life when even small bits of sound or fleeting bits of motion are added to them.

Adding motion and sound to QuickTime VR object movies isn't very difficult and doesn't require any programming. You can infuse some motion into a VR scene by creating what are called "animated" object movies, where a given pan and tilt angle is associated not with a single frame but with a set of frames, which are played in sequence (and possibly looped) when the user is at that particular pan and tilt angle. (This is called "frame animation".) Similarly, you can configure any object movie to automatically play in sequence all the views in the current row of the object movie. (This is called "view animation".) In addition, the author of a VR scene can include sound tracks in the movie file. If a sound track overlaps an object node in the movie's timeline, the VR movie controller automatically plays that sound track when the node is the active node.

But for panoramas, which are by far the most common type of VR movies, there is nothing analogous to frame or view animation, and the movie controller simply ignores any sound track whose duration overlaps that of the panoramic node. To embed sounds and motion into panoramas, you'll need to do some programming. In a previous MacTech article (Monroe and Wolfson, July 1997), we showed how to play sounds that appear to come from specific locations in a panorama or that are ambient in the panorama (emanating from no particular location). In this article, I'll show how to embed rendered QuickDraw 3D objects in a panorama.

There are several obvious uses for this technique. First, you might want to populate a panorama with various objects that move over time. Imagine a panorama with a rendered jet flying by in the sky, or a rendered carousel that spins on its axis in the middle of a panorama. These effects are easy to achieve by taking an existing 3D model, embedding it into a panorama, and then dynamically altering its position or rotation over time. Another use for an embedded QuickDraw 3D object is as a "screen" on which to play QuickTime movies. QuickDraw 3D allows you to map a texture onto a 3D object; this texture can even change over time. So we can use the individual frames of a QuickTime movie as a texture for a 3D object. The result is a QuickTime movie superimposed onto the 3D object. With a small amount of trial and error to get the placement of the 3D "screen" just right, you can play QuickTime movies on top of a TV screen in a panorama, for instance. It's also possible to drop out a solid-color background of a QuickTime movie when mapping it as a texture and thus get a kind of "blue screening" (perhaps to have people walking around inside the panorama).

The basic approach that we'll use to integrate rendered QuickDraw 3D objects into QuickTime VR panoramas is really no more complicated than the one we used previously to integrate directional sounds into panoramas: first, we define an arbitrary correspondence between the QuickDraw 3D coordinate space and the QuickTime VR panorama space. Then we translate changes in the panorama's pan and tilt angles into changes in the 3D camera. The actual implementation of this approach, however, is vastly more complicated with 3D than with sound, primarily because we need to do all the standard 3D setup and rendering, in addition to then embedding that rendered image into the VR panorama. Here we'll describe the general process and give the code for some of the key steps. See the source code for the VR3DObjects sample application for the complete details.

Before reading this article, you should already have read the article mentioned above. That article provides a good overview of the capabilities of the QuickTime VR programming interfaces and shows how to use them to perform some simple actions. You should also be familiar with QuickDraw 3D. See the first chapter of the book 3D Graphics Programming With QuickDraw 3D for a quick overview of how to use QuickDraw 3D. Also, develop magazine has printed numerous good articles describing various parts of QuickDraw 3D; consult the Bibliography at the end of this article for a list of some of those articles.

Lights (etc.)

First, let's briefly discuss the basic QuickDraw 3D setup. As I just mentioned, I'm assuming you're already familiar with QuickDraw 3D or with some similar 3D graphics system. It's beyond the scope of this article to explain everything you need to know to use QuickDraw 3D; the following brief recap is intended only to jog your memory.

All 3D rendering is done using a private data structure called a view. A view is really nothing more than a collection of other objects, including a camera, a set of lights, a renderer, a draw context, and the 3D model. The model specifies the location and geometric shape of the object (or objects) to be rendered, as well as information about how the renderer should apply the illumination from the lights (called illumination shading) and about what texture, if any, is to be applied to the surface of the object (called texture shading).

The renderer determines how the geometric description of the model is converted into a graphical image (for instance, is the model drawn in a wireframe outline of its surfaces or as a collection of colored, shaded surfaces?). The renderer also determines which parts of a model are drawn and which are obscured by other surfaces.

The lights and the camera associated with a view are pretty much just what you'd expect. The lights provide illumination to the objects in the model. A view can have one or more lights of varying positions and colors. The camera determines how the rendered model is projected onto a flat screen (called the "view plane"). QuickDraw 3D supports several kinds of cameras, which are distinguished by their methods of projection.

Perhaps the least intuitive part of a view is the draw context, which maintains information about a particular drawing destination. You can use QuickDraw 3D to draw directly into Macintosh windows, or Microsoft Windows windows, or even into a pixel map (a "pixmap"), a region of memory that is not directly associated with a window. The draw context maintains general information common to all drawing destinations (such as the color to use when erasing the drawing destination) and specific information about a particular type of drawing destination (for instance, the pixel type and size of an offscreen graphics world).
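
To make this recap concrete, here is a minimal sketch of assembling a view from those pieces. MyCreateView is a hypothetical helper (the VR3DObjects sample may organize this differently), but the Q3View_ calls are the standard QuickDraw 3D APIs.

TQ3ViewObject MyCreateView (TQ3DrawContextObject theDrawContext, 
  TQ3CameraObject theCamera, TQ3GroupObject theLights)
{
  TQ3ViewObject    myView;
  
  myView = Q3View_New();
  if (myView == NULL) return(NULL);
  
  // the interactive renderer draws filled, shaded surfaces
  Q3View_SetRendererByType(myView, kQ3RendererTypeInteractive);
  Q3View_SetDrawContext(myView, theDrawContext);
  Q3View_SetCamera(myView, theCamera);
  Q3View_SetLightGroup(myView, theLights);
  
  return(myView);
}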

For present purposes, we want to have QuickDraw 3D draw into an offscreen graphics world, giving us an image that we can later superimpose on the panorama. We're going to superimpose the image by copying it from that offscreen graphics world into the panorama's prescreen buffer (the buffer that contains the unwarped panoramic image that is about to be copied to the screen). QuickTime VR then automatically copies the prescreen buffer to the screen. Figure 1 shows the flow of pixels.

Figure 1. From geometric description to a screen image

Accordingly, our draw context will be a pixmap draw context. First we need to create an offscreen graphics world to hold the pixmap. Clearly, the size of the pixmap should be the same as the size of the QuickTime VR movie.

GetMovieBox((**theWindowObject).fMovie, &myRect);
QTNewGWorld(&(**myAppData).fPixGWorld, kOffscreenPixelType, 
              &myRect, NULL, NULL, 0L);

The QTNewGWorld function is a version of NewGWorld that allows you to specify the pixel type of the offscreen graphics world. The pixel type depends on whether we're running on Mac OS or Windows: for Mac OS we use the value k32ARGBPixelFormat, and for Windows we use the value k32BGRAPixelFormat. Now that we've created an offscreen graphics world of the correct size and pixel type, we can create a pixmap draw context, as shown in Listing 1.
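
For what it's worth, kOffscreenPixelType might be defined along the following lines; this is an assumption about the sample's bookkeeping, not its actual definition.

#if TARGET_OS_MAC
  #define kOffscreenPixelType    k32ARGBPixelFormat
#else
  #define kOffscreenPixelType    k32BGRAPixelFormat
#endif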

Listing 1

CreateDrawContext 
TQ3DrawContextObject CreateDrawContext (GWorldPtr theGWorld)
{
  TQ3DrawContextObject        myDrawContext = NULL;
  TQ3PixmapDrawContextData    myPMData;
  TQ3DrawContextData        myDCData;
  PixMapHandle               myPixMap;
  Rect                      myRect;
  TQ3ColorARGB              myClearColor;
  float                    myFactor = 0xffff;
  
  if (theGWorld == NULL) return(myDrawContext);
    
  // set the background color;
  // note that RGBColor is defined in the range 0-65535,
  // while TQ3ColorARGB is defined in the range 0.0-1.0; hence the division....
  myClearColor.a = 0.0;
  myClearColor.r = kClearColor.red / myFactor;
  myClearColor.g = kClearColor.green / myFactor;
  myClearColor.b = kClearColor.blue / myFactor;
  
  // fill in draw context data
  myDCData.clearImageMethod = kQ3ClearMethodWithColor;
  myDCData.clearImageColor = myClearColor;
  myDCData.paneState = kQ3False;
  myDCData.maskState = kQ3False;
  myDCData.doubleBufferState = kQ3False;
 
  myPMData.drawContextData = myDCData;
  
  // the pixmap must remain locked in memory for as long as it exists
  myPixMap = GetGWorldPixMap(theGWorld);
  LockPixels(myPixMap);

  myRect = theGWorld->portRect;
  
  myPMData.pixmap.width = myRect.right - myRect.left;
  myPMData.pixmap.height = myRect.bottom - myRect.top;
  myPMData.pixmap.rowBytes = (**myPixMap).rowBytes & 0x3fff;
  myPMData.pixmap.pixelType = kQ3PixelTypeRGB32;
  myPMData.pixmap.pixelSize = 32;
  myPMData.pixmap.bitOrder = kQ3EndianBig;
  myPMData.pixmap.byteOrder = kQ3EndianBig;
  
  myPMData.pixmap.image = GetPixBaseAddr(myPixMap);
  
  // create a draw context and return it
  myDrawContext = Q3PixmapDrawContext_New(&myPMData);
  return(myDrawContext);
}

We're going to superimpose the image in the pixmap draw context onto the prescreen buffer by calling CopyBits. Obviously, we want to copy only the parts of the draw context that contain rendered pixels, not the parts that are merely background (otherwise, we would overwrite the entire prescreen buffer). CopyBits allows us to specify a copying mode that replaces a destination pixel only if the corresponding source pixel isn't equal to the background color of the destination graphics port. So, to successfully copy only the rendered 3D objects from the pixmap draw context to the prescreen buffer, we need to (1) make sure the background of the draw context is a known solid color that doesn't occur in any rendered pixels, and (2) make sure that the background color of the prescreen buffer is set to that same known color. The CreateDrawContext function defined in Listing 1 uses the constant kClearColor to set the clearImageColor field of the draw context data structure:

const RGBColor    kClearColor = {0x1111, 0x2222, 0x3333};

In our prescreen buffer imaging completion procedure, we call CopyBits as shown in Listing 2, having first set the background color of the destination port to the same color.

Listing 2

PrescreenRoutine selection
// get the current graphics world
// (on entry, the current graphics world is set to the prescreen buffer)
GetGWorld(&myGWorld, &myGDevice);

RGBBackColor(&kClearColor);

// copy the rendered image to the current graphics world;
CopyBits((BitMapPtr)&(*myAppData)->fPixGWorld->portPixMap,
     (BitMapPtr)&myGWorld->portPixMap,
     &(*myAppData)->fPixGWorld->portRect, 
     &myGWorld->portRect,
     srcCopy | transparent, 
     0L);
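
For reference, here is a hedged sketch of how the prescreen imaging completion procedure itself gets registered. QTVRSetPrescreenImagingCompleteProc is the actual QuickTime VR call; the routine names are hypothetical, and the UPP-creation macro name follows the pre-Carbon universal headers.

// the prescreen routine; QuickTime VR calls it after drawing the unwarped
// panoramic image into the prescreen buffer, just before that buffer is
// copied to the screen
pascal OSErr MyPrescreenRoutine (QTVRInstance theInstance, SInt32 theRefCon);

// elsewhere, when the movie window is set up:
QTVRSetPrescreenImagingCompleteProc(myInstance, 
  NewQTVRImagingCompleteProc(MyPrescreenRoutine), 
  (SInt32)theWindowObject, 0);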

Camera

So, we now see how to get a QuickDraw 3D rendered image from its draw context into the QuickTime VR panorama as displayed on the screen. What we need to understand next is how to connect the 3D coordinate space used by QuickDraw 3D to the photographic space used by QuickTime VR. Clearly, if we place a stationary 3D object somewhere in a VR panorama, we'd like it to remain in that position while the user pans around, and we'd like it to get larger or smaller when the user zooms in or out. We accomplish this by connecting the user's panning, tilting, and zooming to corresponding changes in the 3D camera. This is in fact relatively simple, but we should first clarify the way that QuickTime VR handles its photographic data, since this process can be a bit confusing.

To create a VR panorama, the author takes a number of overlapping photographs, which are flat, rectilinear images. The VR authoring tools stitch these photos together into a single image and project the resulting image onto a cylinder. At runtime, QuickTime VR takes a portion of that cylindrical projection and projects it onto a flat surface (the user's monitor) to create another flat, rectilinear image. These two projections are intended to essentially cancel each other out, giving the user a flat view that is just like the image captured by the author's camera. In other words, the QuickTime VR runtime engine provides the functional equivalent of a viewfinder camera mounted on a tripod that can pan horizontally, tilt vertically, and zoom in and out.

Happily, QuickDraw 3D supports a type of camera that provides exactly these features, the aspect ratio camera. To configure an aspect ratio camera, we need to specify the camera's field of view and the horizontal-to-vertical aspect ratio of the view plane. The aspect ratio is easy to calculate: we simply take the ratio of the sides of the movie box:

myCameraData.aspectRatioXToY =
  (float)(thePort->portRect.right - thePort->portRect.left) / 
  (float)(thePort->portRect.bottom - thePort->portRect.top);
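
Before turning to the field of view, here is a hedged sketch of how the rest of the view angle aspect camera data might be filled in and the camera created; the clipping-plane values and initial placement are illustrative assumptions rather than the sample's actual settings.

TQ3ViewAngleAspectCameraData    myCameraData;
TQ3CameraObject                 myCamera;

// initial placement: camera at the origin, looking down the +z axis
Q3Point3D_Set(&myCameraData.cameraData.placement.cameraLocation, 0.0, 0.0, 0.0);
Q3Point3D_Set(&myCameraData.cameraData.placement.pointOfInterest, 0.0, 0.0, 1.0);
Q3Vector3D_Set(&myCameraData.cameraData.placement.upVector, 0.0, 1.0, 0.0);

// near and far clipping planes (assumed values)
myCameraData.cameraData.range.hither = 0.001;
myCameraData.cameraData.range.yon = 1000.0;

// use the entire view plane
myCameraData.cameraData.viewPort.origin.x = -1.0;
myCameraData.cameraData.viewPort.origin.y = 1.0;
myCameraData.cameraData.viewPort.width = 2.0;
myCameraData.cameraData.viewPort.height = 2.0;

// field of view: taken from QuickTime VR, as discussed next
myCameraData.fov = QTVRGetFieldOfView(myInstance);
// aspect ratio of the movie box, as computed above
myCameraData.aspectRatioXToY =
  (float)(thePort->portRect.right - thePort->portRect.left) / 
  (float)(thePort->portRect.bottom - thePort->portRect.top);

myCamera = Q3ViewAngleAspectCamera_New(&myCameraData);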

The field of view is also easy to determine, since we can just use the QuickTime VR field of view. The only "gotcha" is that the VR field of view is always the vertical field of view, whereas the 3D field of view is either vertical or horizontal, depending on whether the aspect ratio is greater or less than 1.0. (In other words, the QuickDraw 3D field of view is always in the direction of the smaller side of the view plane.) Listing 3 shows how we generate the QuickDraw 3D camera settings based on the current QuickTime VR settings.

Listing 3

SetCamera 
void SetCamera (WindowObject theWindowObject)
{
  ApplicationDataHdl    myAppData;
  TQ3ViewObject        myView;
  TQ3CameraObject      myCamera;
  TQ3CameraPlacement    myCameraPos;
  QTVRInstance        myInstance;
  
  if (theWindowObject == NULL) return;
  
  // get the QTVR instance associated with the specified window
  myInstance = (**theWindowObject).fInstance;
  if (myInstance == NULL) return;
    
  // get the view object associated with the specified window
  myAppData = GetAppDataFromWindowObject(theWindowObject);  
  myView = (**myAppData).fView;

  // get the camera associated with the view object
  Q3View_GetCamera(myView, &myCamera);
  
  if (myCamera != NULL) {
    float            myFOV, myPan, myTilt;
    TQ3Point3D      myPoint;
    TQ3Vector3D      myUpVector;
    
    // set the camera's field of view
    myFOV = QTVRGetFieldOfView(myInstance);
    
    if ((**myAppData).fQD3DFOVIsVert) {
      Q3ViewAngleAspectCamera_SetFOV(myCamera, myFOV);
    } else {
      float      myRatio;
      Q3ViewAngleAspectCamera_GetAspectRatio(myCamera, 
        &myRatio);
      Q3ViewAngleAspectCamera_SetFOV(myCamera, myFOV * 
        myRatio);
    }

    // get the panorama's current pan and tilt angles
    myPan = QTVRGetPanAngle(myInstance);
    myTilt = QTVRGetTiltAngle(myInstance);
    // calculate the new point-of-interest
    myPoint.x = sin(myPan) * cos(myTilt) * k3DObjectDist;
    myPoint.y = sin(myTilt) * k3DObjectDist;
    myPoint.z = cos(myPan) * cos(myTilt) * k3DObjectDist;
    // calculate the new up vector of the camera
    myUpVector.x = -sin(myTilt) * sin(myPan);
    myUpVector.y = +cos(myTilt);
    myUpVector.z = -sin(myTilt) * cos(myPan);
    Q3Vector3D_Normalize(&myUpVector, &myUpVector);
    Q3Camera_GetPlacement(myCamera, &myCameraPos);
    myCameraPos.upVector = myUpVector;
    myCameraPos.pointOfInterest = myPoint;
    myCameraPos.cameraLocation = kCameraOrigin;
    Q3Camera_SetPlacement(myCamera, &myCameraPos);
    // update the QD3D camera
    Q3View_SetCamera(myView, myCamera);
    Q3Object_Dispose(myCamera);
  }
}
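
Listing 3 relies on an fQD3DFOVIsVert flag. Here is a sketch of how that flag might be set when the camera is created; this particular bit of bookkeeping is an assumption about the sample, not a quotation from it.

// the view angle aspect camera's field of view applies to the smaller side
// of the view plane, so it is vertical whenever the movie box is at least
// as wide as it is tall
(**myAppData).fQD3DFOVIsVert = 
  (myCameraData.aspectRatioXToY >= 1.0) ? true : false;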

We call SetCamera in our prescreen buffer imaging completion procedure whenever we determine that the pan angle, tilt angle, or field of view of the panorama has changed since we last set the 3D camera characteristics.
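
A hedged sketch of that check follows; it assumes the application data record caches the most recent values in fields such as fCachedPan, fCachedTilt, and fCachedFOV (hypothetical names).

// inside the prescreen routine: reset the 3D camera only when the
// panorama's view has actually changed
myPan = QTVRGetPanAngle(myInstance);
myTilt = QTVRGetTiltAngle(myInstance);
myFOV = QTVRGetFieldOfView(myInstance);

if ((myPan != (**myAppData).fCachedPan) ||
    (myTilt != (**myAppData).fCachedTilt) ||
    (myFOV != (**myAppData).fCachedFOV)) {
  SetCamera(theWindowObject);
  (**myAppData).fCachedPan = myPan;
  (**myAppData).fCachedTilt = myTilt;
  (**myAppData).fCachedFOV = myFOV;
}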

Action

So far, we've covered the fundamental steps required for integrating QuickDraw 3D with QuickTime VR panoramas: we've seen how changes in the VR environment are reflected in changes to the 3D camera, and how the rendered 3D image is embedded into the panorama. Now it's time to create some motion.

Of course, the simplest way to get objects to move around in a panorama is to move them in 3D space by assigning them new locations. Once an object is moved, we need to render a new 3D image and superimpose it on the panorama. This can be accomplished by calling QTVRUpdate to trigger our prescreen buffer imaging completion procedure.
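
A hedged sketch of such a move follows. Here the translation is simply folded into the same model transform used in Listing 4 (whether the sample keeps position separately is an assumption); QTVRUpdate and kQTVRCurrentMode are the actual QuickTime VR call and constant used to force a refresh.

TQ3Matrix4x4    myMatrix;

// nudge the model a bit along the x axis by composing a translation
// into the model's transform (the amount is arbitrary)
Q3Matrix4x4_SetTranslate(&myMatrix, 0.05, 0.0, 0.0);
Q3Matrix4x4_Multiply(&(**myAppData).fRotation, &myMatrix, 
  &(**myAppData).fRotation);

// force QuickTime VR to refresh the panorama, which triggers our
// prescreen routine and hence a new rendering
QTVRUpdate(myInstance, kQTVRCurrentMode);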

Another way to create some motion is to rotate an object in place. Listing 4 shows a simple procedure that rotates an object about the z and y axes.

Listing 4

AnimateModel 
void AnimateModel (WindowObject theWindowObject)
{
  TQ3Matrix4x4          myMatrix;
  ApplicationDataHdl    myAppData;
  TQ3Vector3D            myVector;
  
  myAppData = GetAppDataFromWindowObject(theWindowObject);
  if (myAppData == NULL) return;
  // rotate the object around the global z and local y axes
  Q3Matrix4x4_SetRotate_Z(&myMatrix, kAnimateRadians);
  Q3Matrix4x4_Multiply(&(**myAppData).fRotation, &myMatrix, 
    &(**myAppData).fRotation);
  Q3Vector3D_Set(&myVector, 0.0, 1.0, 0.0);
  Q3Matrix4x4_SetRotateAboutAxis(&myMatrix, 
    &(**myAppData).fGroupCenter, &myVector, kAnimateRadians);
  Q3Matrix4x4_Multiply(&(**myAppData).fRotation, &myMatrix, 
    &(**myAppData).fRotation);
}

AnimateModel changes the rotation matrix of the 3D model; when the model is next rendered, its rotation matrix is applied to determine its current orientation.

At the Movies

Our final task is to apply a QuickTime movie as a texture to a 3D object. QuickDraw 3D supports texture mapping as part of its general shading architecture. In a nutshell, we need to create a new texture shader and attach it to the 3D model. The texture itself is simply a pixmap that is applied, during rendering, to the surface of the 3D object. So, we need to create a new offscreen graphics world that is the size of the QuickTime movie whose frames will be used as the texture. Listing 5 shows how to create a new texture from a specified movie.

Listing 5

NewTextureFromMovie
TextureHdl NewTextureFromMovie(Movie theMovie)
{
  unsigned long        myPictMapAddr;
  GWorldPtr           myGWorld;
  PixMapHandle         myPixMap;
  unsigned long       myPictRowBytes;
  GDHandle            myOldGD;
  GWorldPtr          myOldGW;
  Rect                myBounds;
  TQ3StoragePixmap    *myStrgPMapPtr;
  TextureHdl          myTexture = NULL;

  myTexture = (TextureHdl)NewHandleClear(sizeof(Texture));
  if (myTexture == NULL) return(myTexture);
  HLock((Handle)myTexture);
  // save current port
  GetGWorld(&myOldGW, &myOldGD);
  // get the size of the movie
  GetMovieBox(theMovie, &myBounds);
  // create a new offscreen graphics world (into which we will draw the movie)
  QTNewGWorld(&myGWorld, kOffscreenPixelType, &myBounds, 
    NULL, NULL, 0L);
  myPixMap = GetGWorldPixMap(myGWorld);
  LockPixels(myPixMap);
  myPictMapAddr = (unsigned long)GetPixBaseAddr(myPixMap);
  // get the offset, in bytes, from one row of pixels to the next;
  myPictRowBytes = (**myPixMap).rowBytes & 0x3fff;
  SetGWorld(myGWorld, NULL);
  // create a storage object associated with the new offscreen graphics world
  myStrgPMapPtr = &(**myTexture).fStoragePixmap;
  myStrgPMapPtr->image = Q3MemoryStorage_NewBuffer(
                        (void *)myPictMapAddr,
                        myPictRowBytes * myBounds.bottom,
                        myPictRowBytes * myBounds.bottom);
  myStrgPMapPtr->width      = myBounds.right;
  myStrgPMapPtr->height      = myBounds.bottom;
  myStrgPMapPtr->rowBytes    = myPictRowBytes;
  myStrgPMapPtr->pixelSize  = 32;
  myStrgPMapPtr->pixelType  = kQ3PixelTypeRGB32;
  myStrgPMapPtr->bitOrder    = kQ3EndianBig;
  myStrgPMapPtr->byteOrder  = kQ3EndianBig;
  (**myTexture).fpGWorld = myGWorld;
  // remember the movie so that NextFrame can call MoviesTask on it
  (**myTexture).fMovie = theMovie;
  SetMovieGWorld(theMovie, myGWorld, NULL);
  StartMovie(theMovie);
bail:
  SetGWorld(myOldGW, myOldGD);
  HUnlock((Handle)myTexture);
  return(myTexture);
}
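
Listing 5 builds only the storage pixmap. Here is a hedged sketch of how a texture shader might then be created from it and attached to the model's display group; AddMovieTextureToGroup is a hypothetical helper, but the Q3 calls are standard QuickDraw 3D APIs.

void AddMovieTextureToGroup (TextureHdl theTexture, TQ3GroupObject theGroup)
{
  TQ3TextureObject    myTextureObject;
  TQ3ShaderObject     myTextureShader;
  
  HLock((Handle)theTexture);
  // wrap the storage pixmap in a pixmap texture, then in a texture shader
  myTextureObject = Q3PixmapTexture_New(&(**theTexture).fStoragePixmap);
  HUnlock((Handle)theTexture);
  if (myTextureObject == NULL) return;
  
  myTextureShader = Q3TextureShader_New(myTextureObject);
  Q3Object_Dispose(myTextureObject);    // the shader holds its own reference
  if (myTextureShader == NULL) return;
  
  // within a display group, a shader affects the geometries that follow it,
  // so the group should be an ordered display group (which keeps shaders
  // sorted ahead of geometries) or the shader should be added before the geometry
  Q3Group_AddObject(theGroup, myTextureShader);
  Q3Object_Dispose(myTextureShader);    // the group holds its own reference
}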

Notice that we tell QuickTime to draw the movie frames into the offscreen graphics world by calling SetMovieGWorld. The only other thing we need to do is periodically draw the next frame of the movie into that graphics world by calling MoviesTask and then inform QuickDraw 3D that the texture pixmap has changed. The code in Listing 6 performs both of these operations.

Listing 6

NextFrame 
Boolean NextFrame (TextureHdl theTexture)
{
  TQ3StoragePixmap    *myStrgPMapPtr;
  long              mySize;
  TQ3Status          myStatus;
  if ((**theTexture).fpGWorld == NULL)
    return(false);
  HLock((Handle)theTexture);
  // draw the next movie frame
  if ((**theTexture).fMovie)
    MoviesTask((**theTexture).fMovie, 0);
  myStrgPMapPtr = &(**theTexture).fStoragePixmap;
  mySize = myStrgPMapPtr->height * myStrgPMapPtr->rowBytes;
  // tell QD3D the buffer changed
  myStatus = Q3MemoryStorage_SetBuffer(
      myStrgPMapPtr->image,
      GetPixBaseAddr((**theTexture).fpGWorld->portPixMap),
      mySize, mySize);
  HUnlock((Handle)theTexture);
  return(myStatus == kQ3Success);
}

We call NextFrame from our prescreen buffer imaging completion procedure, just before we render a new image into the pixmap draw context.
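
For reference, the rendering step itself might look something like the following sketch; fView, fModel, and fRotation are assumed to be the application data fields used earlier, and the submit loop is the standard QuickDraw 3D retraversal idiom.

TQ3ViewObject    myView = (**myAppData).fView;

if (Q3View_StartRendering(myView) == kQ3Success) {
  do {
    // apply the model's current rotation, then submit the model itself
    Q3MatrixTransform_Submit(&(**myAppData).fRotation, myView);
    Q3DisplayGroup_Submit((**myAppData).fModel, myView);
  } while (Q3View_EndRendering(myView) == kQ3ViewStatusRetraverse);
}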

The Final Cut

Here, I hope you'll agree, we've done nothing too terribly difficult. We've simply used the standard, off-the-shelf QuickDraw 3D APIs to generate an image that we overlay onto a QuickTime VR panorama, also using standard, off-the-shelf APIs. The essential step is the way we link the QuickTime VR pan, tilt, and zoom angles to the QuickDraw 3D camera settings: when the VR view changes, we simply alter the 3D camera accordingly and render a new image to superimpose on the panorama. The resulting effects, however, are extremely compelling, and they provide one good method of integrating motion and sound with QuickTime VR panoramas. The reviewers give this technique two thumbs up!

Bibliography and References

  • Apple Computer, Inc. Virtual Reality Programming With QuickTime VR 2.0 (1997). Cupertino, CA.
  • Apple Computer, Inc. 3D Graphics Programming With QuickDraw 3D. (1995) Addison-Wesley, Reading, MA.
  • Fernicola, Pablo and Nick Thompson. "QuickDraw 3D: A New Dimension for Macintosh Graphics". develop, issue 22 (June 1995), pp. 6-28.
  • Fernicola, Pablo and Nick Thompson. "The Basics of QuickDraw 3D Geometries". develop, issue 23 (September 1995), pp. 30-51.
  • Fernicola, Pablo, Nick Thompson, and Kent Davidson. "Adding Custom Data to QuickDraw 3D Objects". develop, issue 26 (June 1996), pp. 80-98.
  • Monroe, Tim, and Bryce Wolfson. "Programming With QuickTime VR". MacTech, 13:7 (July 1997), pp. 43-50.
  • Schneider, Philip J. "New QuickDraw 3D Geometries". develop, issue 28 (December 1996), pp. 32-55.
  • Thompson, Nick. "Easy 3D With the QuickDraw 3D Viewer". develop, issue 29 (March 1997), pp. 4-26.

Tim Monroe, monroe@apple.com, is a software engineer on Apple's QuickTime team and is currently developing sample code for the new QuickTime 3.0 programming interfaces. In his previous life at Apple, he worked on the Inside Macintosh team.

 
