TweetFollow Us on Twitter

Serious About MPW
Volume Number:3
Issue Number:4
Column Tag:Advanced Mac'ing

Getting Serious About MPW

By Frank Alviani, Chief Tool Maker, Odesta, Inc., MacTutor Contributing Editor

Introduction

We had all just hunkered down into our seats with some fresh coffee when The Boss wandered in to get the department meeting going. He had been talking a lot about "improving our productivity" lately, and word was going around that we were soon going to be switched from the Good Ol' Lisa to MPW.

"Alviani", he started off with (a bad sign for sure), "you've got a lot more experience with different environments than most of the guys here" (and a lot more gray, I thought to myself), "and you spend your time building tools for the rest of the staff, so I've decided to let you be the first to switch over to Apple's new MPW system. I want you to figure out how to actually use the dang thing, which the manuals sure keep me from finding out. Find the bugs, build procedures, squeeze the daylights out of it..."

Great, I thought to myself above the rising crescendo of patriotic brass band music, I get to play human mine detector......

MPW (which stands for Macintosh Programmers' Workshop) has recently become available to the world thru APDA, and, like most revolutions, has polarized most people into 'fer it' and 'agin it' camps. I have no intentions of trying to carry out religious conversions in this article; instead, I'd like to spend some time clarifying the (often excessively terse) MPW reference documentation and providing some "hints and tricks" material on how to use it.

I should state before going any further that the documentation that comes with MPW makes no pretense of being anything other than reference material, which is not terribly unreasonable: the MPW and assembler manuals are already 2.5" thick and weigh 5 lbs. The Pascal and C manuals are each about 1" thick. Thus, if you have the entire package, there's 4.5" - 5" of reading ahead of you (add another 1" for MacApp). Including tutorials for each component would require Apple to ship MPW by truck. (Besides, I'd have a much worse chance of getting Dave Smith to print this article if there were first-rate tutorials included....)

Also, let me emphasize that I assume you have the MPW documentation to refer to when reading this - I am trying to clarify the reference material, not repeat it. I am working from version 1.0, APDA draft of October 3, 1986 (the latest shipped from APDA at the time of writing).

What is MPW and who is it aimed at?

MPW is an easily extensible integrated environment, inspired by the famous UNIX programmers' workbench. The Think Technologies products, Lightspeed C™ and Lightspeed Pascal™, are also integrated environments, although language-specific ones.

The MPW environment, like the UNIX workbench, is command-line oriented, with the ability to specify options on the command lines, etc. While this may seem like a reversion to a more primitive time, there are several advantages to this approach when programming:

(1) The method of executing commands is quite flexible. You can select any text in any window (using standard editing methods) and simply execute it. The "Instructions" files that come with the various language packages, for example, use this approach by embedding commands in the explanatory text and telling you to select and execute them to build a specific program.

(2) Unlike UNIX, the worksheet that you usually work from is a normal text window. This means that a number of commands can be typed and checked before execution. Commands that you need repeatedly are simply left in the window and re-selected when needed again. If you need to execute a command repeatedly with a set of run-time options, they are set once when typed on the command line, rather than being selected each time from the menu bar.

Shell variables and command files allow MPW to be totally and precisely configured for a specific purpose, and changed easily when needed. User-built Tools, properly configured, have all the freedom of those shipped with MPW, and allow it to be extended virtually indefinitely.

MPW is aimed at larger, multi-file projects, rather than at quickie, "one-off" programs, where it would be overkill. There is definitely a learning curve associated with MPW, but the results allow you to automate most of the mechanics of working with large projects, freeing you to concentrate on the actual programs (which are already complex enough). In addition, once you've gotten the hang of writing command files, etc., there is a certain glee in watching them execute that's like watching a piece of Victorian clockwork in action!

In short, MPW is rather complex for somebody trying to learn the Macintosh, but is extremely nice for somebody already familiar with the Mac. Once you've worked with MPW for a while, it becomes quite comfortable, and, like an old slipper, fits better the longer you wear it.

Good and bad points of MPW

There are plenty of both, as this is a big product (thirteen 400K disks and 1675 pages of documentation if you get all components).

Good Points:

Totally configurable. The combination of command options, command files, tools, and the ability to build custom menus allows you to set up MPW exactly the way you want for any purpose, and to change it at any time.

Excellent help facilities. While it won't replace the documentation, the help system does a good job of explaining exactly what the options are for each command, the meaning of the various special characters used in regular expressions, etc. You can enter Help Commands to get a list of the commands available, each followed by a comment about its function; typing Help before a specific command in the list and hitting the enter key then gives you the details for the requested command.
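
For instance, either of these can be typed anywhere in the worksheet, selected, and executed with the Enter key:

Help Commands    #one-line summary of every command
Help Replace     #full option-by-option details on the Replace command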

Powerful programming facilities in the shell. The shell includes a complete set of control structures so full programs can be built in the command language. Looping, branching, and begin-end structures are provided; when combined with variable expansion, dialog boxes, etc., quite powerful tools can be built quickly.
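
A throwaway sketch of the kind of thing that becomes possible - it asks for some file names in a dialog and prints a line count for each (Count's -l option counts lines only; the file names are whatever you type):

set targ "`request 'Count lines in which files?'`"   #ask via a dialog
for f in {targ}                                      #loop over each name typed
 echo -n "{f}: " ; count -l "{f}"                    #one line of output per file
end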

Powerful command files possible. Any text file can be executed as a command, so that complex sets of commands such as mentioned above can be put into "macro" files and executed whenever desired, just as if they were regular commands built in. These, of course, can refer to other command files, so that capabilities are almost unlimited.

The entire external environment is available to tools. This means that options can be put in variables that are globally available, for example, and accessed by a tool. This can make life easier in some situations.
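
For example, a single pair of lines in UserStartUp makes a compile-option string visible everywhere; any command file (or Make, or a tool that cares to look) can then pick up {Poptions}. The -i search-path option here is only an illustration - substitute whatever options you actually use:

set Poptions "-i HD20:Project:"   #assuming -i is the Pascal search-path option
export Poptions                   #now globally available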

The assembler is extremely powerful. The macro capabilities of the assembler are explicitly modeled after those in the IBM 360/370 assembler, which is widely regarded as being absolutely first-rate. Sophisticated parameter usage, code optimization(!), string manipulation and conditional assembly commands are waiting for you. The ability to declare data structure templates in a way very similar to that of C or Pascal simplifies life greatly, and "object-oriented" features make it easier to work with MacApp or other object-oriented environments.

Sophisticated compilers are available. While the Pascal compiler seems to still have a few bugs, the code produced is quite good. The C compiler also seems to produce excellent code. Both include the tools necessary to write desk accessories and MPW tools easily, without clumsy workarounds. The best environment around is worthless if the compilers aren't reliable or if they produce mediocre code.

It opens object-oriented programming technology to the average programmer. Object-oriented programming is looking like a very promising way of handling large, complex programming projects (which includes 'most anything on the Mac, it seems). Estimates of productivity improvements have ranged as high as five-fold. By taking this technology out of the academic world and making it widely available, Apple should help Mac programmers produce the best programs in the world at an even faster clip.

It is looking towards the future. The assembler, in particular, explicitly includes support for: 68000/68010/68020/68030 processors, the 68881 floating-point math unit, and the 68851 paged memory-management unit, important considerations now that the Mac II is available.

Bad Points:

Complex to learn. Since there is a large command set, with many options possible for most commands, it takes a while to learn how to use it, and longer to learn how to use it well.

Not as fast as some integrated environments. It does not appear to be as fast as the Lightspeed environments, which could be a real problem when dealing with very large projects.

Only reference documentation is provided, which is why this tutorial exists at all. Much hacking and experimentation is required to learn to use MPW at all reasonably.

A tool or command file can't affect a variable defined outside itself. This complicates the process of returning information to a caller. Work-arounds are usually, but not always, possible.

Hints and Tricks

Now that I've gone ahead with a mini-review of MPW, let's get down to some day-to-day details of using it that aren't included in the documentation.

Programming the shell: This is the area that is most unlike the regular Macintosh environment. Using typed commands isn't totally alien to most of us, but the ability to build your own commands is limited in most systems. The closest experience for most people would be Red Ryder macros.

In the following examples, the actual MPW commands will always be set in a Mono-spaced font, like this, so they are easily distinguishable from the rest of the text (this is the same convention used in Apple tech notes).

• The mysteries of quoting. The use of the various quotation marks is not at all obvious from the MPW manual. Some rules of thumb about quoting are:

(1) There is a definite hierarchy of "quote-mark strength". From strongest to weakest, the different quotation marks are:

(a) `...` - execute the enclosed statement and replace it with its output (strongest)

(b) '...' - take the contents literally (no substitutions)

(c) ∂char - take char literally, even within "..."

(d) "..." - variable substitutions, etc. occur

(e) /.../ and \...\ - regular expression quotes (weakest)

(2) The various quotation marks, unlike parentheses, do not "nest". Therefore, you can only use one set of each kind in an expression, except for the "delta" form ∂char, which you can use anywhere since it doesn't occur in pairs. That is,

set var "This is a "quote"" #this won't work at all
set var "This is a "quote"" #check out - shouldn't work
set var 'This is a "quote"' #this should work

which means that you may sometimes find it necessary to define a helper variable using the stronger quotes, then expand it inside weaker quotes. For example, see the section talking about the differences between a command built directly in a menu, and commands built in command files and executed from a menu; the quotation rules cause the major difference.

• shell variables. While shell variables can hold simple abbreviations or numeric values, there are other uses for them. A powerful use is to hold commands that can be used with variable expansion to assign the results of a dialog to a variable. For example:

set ask "request 'Resource ID?'" #command - ask for ID #
set res "`{ask}`"#res := output of the #request command

The "soft quotes" are used in the 2nd line as a safety precaution. The shell will (1) replace {ask} with the request command [including the hard quotes], (2) execute the request and replace the command with whatever is typed in by the user, and (3) assign the command-replacement to the variable "res" (the command-replacement could include blanks, hence the "soft quotes").

This technique must be used when building commands as part of a menu item and you wish to use any commands that need hard quotes (there's an example coming up shortly). It is also handy when you have a group of command files that all need to use the same command; by exporting the variable containing the command, any command can use it.
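
That is exactly what the UserStartUp listing at the end of this article does with its request commands; the pattern boils down to:

set ask "request 'Resource ID?'"   #the command itself, inside hard quotes
export ask                         #now every command file can see it
# ...and then, inside any command file that needs an ID:
set rid "`{ask}`"                  #expand {ask}, run the request, keep the answer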

• Assigning selections to variables: This is well buried in the manual, and is not in the least obvious (thanks to Darin Adler for pointing it out!). The trick is to use command substitution with the catenate command, which can take a selection in a window and output it to StdOut. Thus, the technique is:

set var `catenate "{window}".§`
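
For instance, to capture whatever is currently selected in the target window:

set word `catenate "{target}".§`   #the selection of the target window
echo "selected: {word}"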

• Concatenation: While this is not spelled out in the manual, it turns out to be pretty trivial. The key concept is that any variable expansion takes place within the quotes before the assignment is executed. The following set of commands will define a variable (we'll use it as part of a path name), and then concatenate it in various ways:

set nm Partial         #define nm to the value 'Partial'
set pre C:Sources      #define directory prefix
set nm "{pre}:{nm}"    #nm now is "C:Sources:Partial"
set nm "{nm}.c"        #nm now is "C:Sources:Partial.c"

• Insertion and deletion. As with other parts of MPW, there are usually several different ways to do these operations; the techniques differ according to whether the clipboard is used or not. To illustrate deletion, see fig. 1 below.

• Building shell command files. This is one of MPW's greatest strengths. Shell command files allow you to extend MPW infinitely in any direction you care to take it. The Make utility, for example, creates a shell command file as its output; when that command file is executed, exactly what is needed to rebuild the application is carried out without further human intervention.

Command files are programs, and need to be treated as such. You have variables, conditional execution, and powerful string facilities at your fingertips; it is also possible to debug command files pretty easily once you have done one or two.

The easiest way to debug command files is to turn on the 'echo' variable at the start of execution, using the statement:

set echo 1

but this can have strange effects, depending on how your command is written. Especially when you are doing pattern searching and replacing, the command should be written to execute in a different window than the one from which it was invoked (typically the target window), so that the output of the command echo doesn't interfere. A little experimentation will make this quite clear. Another useful technique, especially when trying to figure out exactly what the result of some regular expression is, is to "single-step" by using exit commands after each expression: see what happens, adjust until it works as desired, move the exit to the next command....
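
A command file under test therefore tends to start out looking something like this (the find is only a stand-in for whatever is actually being debugged):

set echo 1          #trace: show each command as it is executed
find • "{target}"   #...the commands being worked on...
exit 0              #single-step: stop here, inspect the result, then
                    #move the exit down one command and re-execute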

• Building your own help files for those command files. The help command can be passed the name of a file to use, which means you can write custom help files for complex command files if necessary. Using menu commands to build in the name of the help file could simplify matters even more. Fortunately, the format of the help file (which is an ordinary text file) is extremely simple:

(1) Entries are separated by a line containing a '-' in the 1st column.

(2) The 1st word on the line following a separator line is the key looked for by the help command. Everything between separator lines is displayed.
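
Put together, a help file covering a couple of the command files from this article might look like this:

-
doRez      Builds the interactive Rez menu described in this article.
           Everything between the separator lines is displayed.
-
glossary   One-key glossary for the active window. Invoke it once to mark
           the start of a shortcut, type the shortcut, invoke it again.
-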

• There are slight differences between commands added to menus and the same set of commands executed from a command file. The basis for these differences is the MPW quoting process. The syntax of the AddMenu command is not complex:

AddMenu menuname itemname command

which you'll notice implies that a single item follows the item name. Therefore, to include any useful command, it must be enclosed in quotes (normally the "hard quotes": '..'). Since you can't nest quotes, anything which would be enclosed in hard quotes in a command file can't be so quoted in the AddMenu command, but must instead be defined as a helper variable and expanded when the addmenu command is executed.

Also, menu commands need to have their variables declared in the program doing the 'AddMenu' (usually User Startup) and then exported (due to scope rules) whereas command files can declare their own variables. Trying to declare variables within a command attached to a menu doesn't work. Such a command must use "external" variables, which must be exported from the defining program to be available to it. The normal way of adding a complex command to a menu from User Startup becomes:

# various other commands.....
set x xxx; export x
set y yyy; export y
AddMenu NewMenu 'NewItem/n' ∂
'begin ; ∂
 set x {x}{y}; echo {x} ; ∂
 end'

Notice here that the actual text of the command is enclosed in the hard quotes, and that lines are always ended with ∂-newline (nothing following the ∂!!) so that each line of the command is concatenated to its predecessor, resulting in the required single item.

• Regular expressions and navigating thru the MPW universe. Regular expressions are how you maneuver through a file. It can take a while to get used to this if you are used to a position-oriented macro editor such as MEdit.

The power of the MPW regular expression mechanism lies in the fact that you have the ability to both find a specified pattern of characters, and also to work relative to that location, in terms of lines and characters. You can either set the editing position (insertion point) to a given location, or set the extent of a selection.

The fundamental approach to take in many cases is to use the pattern- matching powers for "coarse navigation" and then to use the positioning capabilities to set the selection or position exactly. Typically, you might search for "any string with $ on the left and . on the right", then extract "the selection after the $ and before the ." (getting the dollar amount from a formatted entry) although this example would be done in a single statement:

find /$/:/./ "{active}"#extract dollar value

A common need is to copy an entire line, without the final newline. This is where the combination of pattern and positioning is the only easy way to do it:

find /•?/:/n/  #find from 1st char up to the position  #before the 
newline

Another common need in command files is to be sure you're at the start of the current line before processing begins. This is a situation where the fact that the positioning commands are evaluated with respect to the immediately preceding subexpression is important (and non-obvious); the following code first selects zero lines before the current line, then sets the position to zero characters before it:

find ¡0¡0 "{active}"     #insertion pt @ start of this line

Also, as mentioned in the documentation, regular expressions are used for parsing text. In combination with the tagging operator ® (used in the evaluate command as shown in its writeup) you can extract very particular pieces of complex strings for later processing.

Fig. 2 Scope of common variables

Pay attention to the priority of the operators used in regular expressions. Just as in programming in 'C', most expressions behave the way you expect them to, but the ones screwed up by the different operator priorities can be infuriating to debug!

• How to get the effects of common variables to a limited extent (which the shell ordinarily prevents). The fact that it is impossible for a command to affect a variable which was defined outside itself prevents the use of variables to pass information between commands. See the diagram below that illustrates the problem (modeled after the figure in the documentation).

It is possible to get some communication by using a scratch window to exchange info between command files. The technique is to maintain a string in a scratch window that can be worked with using replace, find, etc. In order to allow a number of command files to share a window without interfering with each other, I use the "keyword variable" approach: the string used to communicate between files is of the form keyword = value. Unique keywords pretty much ensure that different programs will remain isolated. (See fig. 3.)

Fig. 3 Using Scratch Files

It is also possible, if there is only a small set of possible values for a "variable", to get some of the effects of a finite-state machine, by taking advantage of the fact that a replacement operation returns a status that can be tested. If you are working with a "finite state machine" approach, you need one program block for each state (here in pseudo-code):

- replace mode_n with next_mode

- if the replacement succeeded, you were indeed in mode_n, and can do whatever you need to. End the program block with "Exit 0" so that the calling program can continue normally.

- if the replacement failed, you were in one of the other modes, so fall thru to try the next one (the "keyword variable" will be unchanged by the attempt).

- after the last state, reset the string to mode_1 so that you can continue the cycle again.

Fig. 4 Shell Communication

Here is a complete command file for a glossary function that shows how it is done (the state-saving lines are the ones that read and write the scratch window "{scr}"):

# This is a 1-key glossary function
# To use: invoke it, and Ÿ will appear. 
#type the shortcut
#invoke it again. 
# The shortcut will be expanded in place
# Shortcuts that aren't found are simply deleted, 
# along with the Ÿ and ¥ bracketing them.

#first-time setup test
find • "{scr}"; find /sel=/ "{scr}"   
if {status} != 0  #test variable state    
echo -n "sel=0" > "{scr}" #make sure of starting state
end 

#were we in 'tag shortcut mode'?
find • "{scr}"
replace /sel=0/ sel=1 "{scr}"#move to 'expand' mode
if {status} == 0#we were in "tag" mode
 replace § Ÿ "{active}" #tag start of shortcut
 exit 0 #exit normally
end

# replace failed - must have been in 'expand mode'
replace § ¥ "{active}"  #tag end of shortcut
find \Ÿ\:/¥/ "{active}" #find shortcut (without tags)
set xx `catenate "{active}".§`#save selection in variable
find • "{gloss}" #start at beginning of window
find /{xx}/ "{gloss}"#find shortcut
if {status} == 0
 copy /‘/:/’/ "{gloss}" #copy expansion that follows   paste \Ÿ\:/¥/ 
"{active}"#replace shortcut (with tags)
else
 replace \Ÿ\:/¥/ '' "{active}"#delete shortcut (with tags)
end
echo "sel=0" > "{scr}"#back to 'tag shortcut mode' 

Note, however, that the lines setting the state into 'tag shortcut mode' will replace the entire contents of the scratch window. If you are using the scratch window for several strings, you should replace them with:

replace § "sel=0" "{scr}" #set-up:just insert
...
find • "{scr}" #back to 'tag shortcut mode'
replace /sel=1/ "sel=0" "{scr}" #leaves all else untouched

• Pipes , redirection of I/O, and conditional execution. These are features adapted from Unix which can be very powerful once they are understood (the pipes/conditional-execution writeup is buried in chapter 3 under "command terminators"). These features are all implemented by the Shell.

In a true Unix system, a pipe can be either used to communicate between commands or used as a "data area" to communicate between tasks while they are each executing, in a multi-tasking mode. Since the Mac is a single tasking system, MPW pipes are only available between commands, and are essentially anonymous files with implicit I/O redirection. Pipes allow you to combine several commands without having to worry about creating intermediate work files. A common use of this is to follow a command such as files by another command to do something with the list of filenames produced ("list all pascal files | count lines | sort into ascending order by linecount to impress the boss").
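
A minimal version of that idea - counting the files themselves rather than their contents - looks like this:

files HD20:project:≈.p | count -l   #how many Pascal sources in the project?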

I/O redirection is one of the most powerful features of the MPW shell from a programmer's viewpoint. Because of it, you can write your tools to simply read from "standard input" and write to "standard output" without worrying about the actual source of the data. The USER decides whether to type at the keyboard, send in input from a file, or even use a selection in a window! Similarly, the output can go to any window or file. It is this I/O redirection that allows you to use the catenate command to assign the contents of a window selection to a variable! (See figure 4 below.)
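
For a home-grown tool (MyTool here is purely hypothetical), that means all of the following work without the tool knowing the difference:

MyTool < params.text  > results.text    #input from a file, output to a file
MyTool < "{active}".§ > "{worksheet}"   #input from the current selection,
                                        #output to the worksheet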

• A useful trick with the Count command. An example of the power of combining MPW tools is this command to provide line/character counts for all the pascal source files in a project folder:

Count `files HD20:project:≈.p`

which will list the counts one per line. It works because the output of the files command is substituted for the command itself before the Count is executed.

Command file examples

A fairly complex set of command files accompanies this article. One set creates a menu that allows you to build Rez input files using interactive prompts for any resource type needed, to avoid the typically obscure syntax errors that can easily creep in, and to make sure all the parameters are in the right order, etc. Another file implements a glossary function allowing shortcut replacements of unlimited size that shows a technique for passing information between command files, or retaining variable values between invocations of the same command file.

I won't go into detail about every command file (having gotten a technique functioning, I tend to work it to death), but there are some points worth going into as examples. Let's start with the UserStartUp file, where the process of adding menus begins:

The basic structure I use when adding menus is as follows:

Confirm 'Do you want menu x'
if {status} == 0                   #yes I do
 begin
  set br 'BoundsRect (t,l,b,r)?'   #for expansion later
  export br                        #so nested file sees it
  #define more commands, etc.
  "HD20:MPW:doRez"                 #this has the AddMenus
 end
end

A separate command file is used to do the actual AddMenus, both to keep UserStartUp to a manageable size and so that the menu is easy to change: I can simply change HD20:MPW:doRez and re-execute it to update the whole menu at once. A few commands that are used in menu items directly are defined here.

There is no fixed rule as to when a function can be implemented directly as part of an AddMenu command and when it is better set up as a separate command file, which is executed by the menu command. I simply put the more complex functions in separate files, to keep things simple and to ease debugging. My cutoff point is about a dozen lines.

• The 'doRez' file, that actually builds the menu: This is a mixture of items that directly do something, and items that execute other files. Notice that in the "direct execution" items, every time we need command substitution, it is done by variable expansion, due to quoting restrictions:

set str "`{st}`"   #expand st into the request command, execute it, and
                   #assign the command output to variable {str}

Also notice (in the definition for DITL, for example) that I don't use the "execute" command, but simply give the complete command file name. This is because the execute command doesn't pass parameters through, which we need here. The "getResAttrs" routine is a good example, by the way, of how MPW can be extended almost indefinitely: every resource-type-building command invokes it to get the ID# and other attributes of the resource under construction.
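
In other words, the type name tacked on after the file name arrives inside getResAttrs as the parameter {1}, which is how one small file can serve every resource type; from the listings:

"{MPW}macros:getResAttrs" DITL    #the caller passes the type name...
# ...and inside getResAttrs:
echo -n "resource '{1}' ({rid}"   #...where it is picked up as {1}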

• The "doAlert" command, that builds an Alert definition. This shows the general approach taken in most of the code to build individual Rez def- initions. Here is the actual command file, with notes following:

echo "/* `request 'Purpose:'` */" #what is definition for?
"HD20:mpw:macros:getResAttrs" ALRT #(Note 1)
set rr "`request 'BoundRect? (t,l,b,r)'`"    
#prompt for order of coordinates
set st4 "`request 'St4- 
 OK|Cancel,invisible|visible,beeps|silent'`" #(Note 2)
#expand abbreviation for stage 4
if {st4} =~ / [Oo] / ; set s4 OK ; #(Note 3)
 else ; set s4 Cancel ; end
if {st4} =~ / [Ii] / ; set s4 "{s4},invisble" ;    #(Note 4)
 else ; set s4 "{s4},visible" ; end
if {st4} =~ / [Bb] / ; set s4 "{s4},beeps" ;
 else ; set s4 "{s4},silent" ; end

#use stage 4 values as defaults for following stages
set s3 "`request -d {s4} 'St4-
 OK|Cancel,invisible|visible,beeps|silent'`"
set s2 "`request -d {s4} 'St4-
 OK|Cancel,invisible|visible,beeps|silent'`"
set s1 "`request -d {s4} 'St4-
 OK|Cancel,invisible|visible,beeps|silent'`"

#output filled-in definition
echo "{{rr}},t/* rect */"#boundary rectangle
copy •:/n/ "{scr}" #(Note 5) get saved id
paste § "{active}"        #insert at current insertion point
echo ",t/* DITL */"  #explanatory comment
echo "{s4},t/* stage4 */"#settings for each stage
echo "{s3},t/* stage3 */"
echo "{s2},t/* stage2 */"
echo "{s1} t/* stage1 */"
echo "};"#finish up definition

Note 1:

The subroutine not only gets the resource ID# and its attributes (defaulting to "Purgeable"), but also writes the resource ID to a scratch window, from which it is retrieved later. Since it is customary for DITLs to use the same ID as their DLOGs and ALRTs, this avoids asking the user for the ID twice.

Note 2:

Normally, you'd only enter the 1st letter for each choice, such as OVS (OK,Visible,Silent).

Note 3:

This uses pattern matching to expand the abbreviation into a form Rez understands. I use character sets [Oo] so it is not case-sensitive. The ≈s bracketing the desired character sets are needed to allow for blanks, etc.

Note 4:

A classic example of concatenation, adding the new parameter onto the existing string. Once this series of tests is done, the expanded form becomes the default for the following stages.

Note 5:

Here is where we get the DITL ID that was written out for us by getResAttrs from the scratch window. While we can't pass variables between commands directly, this works pretty well.

• The "doMenu" command that builds an entire menu definition. The actual program is listed later, being too long to include "in-line". This is a little different from other command files in that it includes an inner loop used to build the definition of each menu item. The loop terminates if there is no title entered, or if the cancel button is pressed. Each item is output immediately once all the required information has been entered.

Notice that as the very last act, the command deletes the ; following the last menu item. No ';' may precede the '}'s that finish a Rez definition. You will find that all the commands given are careful about this, since Rez will object violently to extra semi-colons.
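
The doMenu listing itself isn't reproduced here, but the clean-up step is the same trick the STR# and 'End DITL' menu items use - a backward search for that last semicolon:

cut \;\ "{active}"   #backward search: remove the ';' after the last item
echo "∂n}∂n};"       #then close off the item list and the definition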

Programming the Make Utility

This is one of the most obscure and non-obvious, yet powerful and convenient, facilities in MPW. In some respects, it is the very heart of the system, since it can totally automate the process of building a program (or several programs at once, for that matter) once the command file is set up properly. When you are dealing with programs that can exceed 100,000-200,000 lines (as we do where I work), Make isn't a luxury - it is an absolute necessity. Any program can, in fact, use Make effectively, since it can build everything automatically, including running the resource compiler Rez. (RMaker was notorious for often not cooperating with Make utilities, not having been written with them in mind.)

The aim of a Make command file (called a makefile from now on) is to describe the exact way in which each file in it depends on other files, and to provide Make with the commands necessary to rebuild it when necessary. These commands can either appear explicitly in the makefile, or can be constructed from default rules you write into the makefile or from defaults built in to Make itself. The result is generally a tree structure: an application depends on certain object files, which depend on certain source files, etc.

• Order of processing makefile commands. The order in which Make processes the contents of a makefile, and any default rules built into Make itself, is important, since it determines the order in which the output commands are executed. The basic principle is fairly simple: commands progress from "highest level" to "lowest level", in a "last-in, first-out" fashion. When the makefile is processed from start to end, the last-update time for each "target" file is checked; if any of the files on which it is dependent are more recent, then the commands needed to rebuild it are (essentially) added to an output queue.

When the whole file has been processed, the queued commands are popped off to the output file. When the commands needed to rebuild a file are output, they are written out in the same order you wrote them. The normal order of makefile commands is (a skeleton example follows the list):

° Directory dependency & default rules

° "Rez" code & resources together into an application

° Link object files into a code resource file

° Compile all needed source files into object files

° Any "independent" commands needed, such as for rebuilding a "dump" file

• "Directory dependency rules" and the default rules. The documen- tation manages to utterly obscure how these interact. The aim is to provide a simple way to have all your object files in one folder, sources in another, etc. and still have the default build rules apply.

The built-in default rules separate the path names into directory and file components, by using variable substitution. Directory dependency rules are needed to specify exactly which directories are to be used for source and output files when default rules are used to generate commands in the output command file. They are required if all of the files for a project (source, object, resource, etc.) are to be separated in any fashion and the default rules are used. For example:

frank:mapper:obs: ƒ frank:mapper:   #output ƒ source
.p.o ƒ .p
 Pascal {DepDir}{Default}.p -o {TargDir}{Default}.p.o

#all object files will end up as frank:mapper:obs:file.p.o
#all source files come from frank:mapper:file.p

• How to set up a makefile so execution terminates after the compilation phase if any compiles failed. The basis for this "conditional" capability is buried in note #7 about Make's makefile, but one qualification is not mentioned: the shell variable {exit} must be set to a non-zero value to ensure that the command file aborts after any compilation fails. In fact, an even more convenient setup is possible with little effort.

The property of MPW when {exit} is zero ("continue no matter what!") that interferes with a clean conditional build ability is simple: recompiling or relinking an existing file only updates the existing file, rather than deleting it and writing a new copy on successful completion. This means that without precautions, a compile that fails will leave its old object file in existence, so the link will spuriously complete...the Rez command will complete...and you don't really have a copy of the application!

We are going to take advantage of the fact that the compilers return status values just like other commands do, in {status}. The approach is to begin execution of the generated command file with {exit} set to zero. Thus set, every compilation will take place, even if some fail. As each compilation finishes, the value of {status} that is returned is added to a "status accumulator". When the link step is about to begin, the value of the "status accumulator" is tested, and we exit if it is non-zero (because at least one of the compilations failed). Here is a fragment of a makefile that shows how it's done:

Mapper.code ƒ {ob}Boxes.p.o and others......
 exit {total} if {total} ≠ 0      #abort if any compiles died
 Link {ob}Boxes.p.o and many others......

{ob}Boxes.p.o ƒ Boxes.p Globals.p
 Pascal {Poptions} Boxes.p
 set total `evaluate {total} + {status}`
In the fragment, total must not be externally defined and exported!! The variables you are going to use must be totally local to the Makefile. If you read the documentation very carefully, and experiment, you will find that any variables that are user-defined outside of Make are substituted when the output file is being written; that is, their values are written into the output file, rather than the variables themselves. This "build-time" substitution is why {Poptions} is usually defined in the user startup file and exported.

So far, I have generally used explicit commands in putting together makefiles in this fashion. Default rules work generally in the same way, but require caution, as explained below.

A segment using default rules looks very similar, except that there wouldn't be commands for each individual file:

.p.o ƒ .p
 Pascal {DepDir}{Default}.p -o {TargDir}{Default}.p.o
 set total `evaluate {total} + {status}`
 #Default applies to all source files
 #no object file if compile fails

A major caution here: be extremely careful in specifying your file dependencies!! It is a good idea to explicitly state that an output file is dependent both on its source and all included files. This is important with files that declare global variables, and may not be obvious (it wasn't to me!).

While the linker deals with globals by name, offsets within records are numeric in nature. If you modify a record definition in an included file, for example, and the make file doesn't record that your source is dependent on that included file, the source file won't be recompiled using the new definition, resulting in inaccurate record offsets and very hard-to-track- down bugs!

While default rules do work well in some circumstances, they must be used with care. Since I include files that define record types, I have generally used explicit dependencies. This is more typing, but safer.

• Make can carry out any commands you specify - other things you can do with it beyond simple compilations. For example, a "dump" file is dependent on the files specified in the $LOAD directive that created/uses it. Should any of the files included in it change, it should be deleted so a new version can be rebuilt by the first module using it. This is easily done as follows:

HD20:obs:inst.dumpfile  ƒ Memtypes.p Other.p
 delete -y HD20:obs:inst.dumpfile  #force rebuilding

• How to time your build runs. The following sequence will not only build your project, but also time how long that took. It is set up to be a menu command:

AddMenu Nifties 'Make Project/0' ∂
'begin; ∂
 directory HD20:Project ;          #switch to proper directory ∂
 open -n scratch;                  #create/open temp window ∂
 echo `date -s` > scratch;         #write starting time ∂
 make -f makeproject > MakeOut ;   #create make command file ∂
 MakeOut ;                         #actually build project ∂
 echo `date -s` >> scratch;        #ending time on new line ∂
 echo "status={status}" >> scratch #final status, too ∂
 end'

Other Components of MPW

Rez and DeRez are very powerful programs: Rez is a complete replacement for RMaker, with considerably improved powers over the MDS version, such as conditional execution, variable substitution, etc. As in the Lisa environment, Rez normally includes the CODE resources from the linker into the final executable program file.

The Rez language somewhat resembles C declarations, and it has "pre-processor" capabilities also similar to C. A major improvement is that you now have the power to define your own resource templates, so that it becomes possible to define complex resources in a very readable format, rather than in an endless stream of hex digits.

In larger projects, the #include statement becomes very useful. The resources for a program can be split into a number of different files (by type, for example) and the actual input to Rez becomes a short file including all the others. As usual, the advantage of this approach is that it is far easier to edit the individual files.

Be careful of the difference between the #include and include statements (note that the latter does NOT start with #). The #-less form actually reads resources into the output file; that is how, for example, pictures can be easily compiled in, without ever having been converted to hex. (See Note 1 at the end for the details.)
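
As a rough illustration (file names invented), a top-level Rez input file might then contain little more than:

#include "MyTypes.r"      /* read more Rez source text, C-style */
include "MyPicts.rsrc";   /* copy already-built PICT resources straight in */
include "Mapper.code";    /* pull in the CODE resources from the linker's output */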

• On the subtleties of Rez syntax: There are a few points of Rez syntax that can be frustrating to master, most especially the placement of braces and semicolons. The manual is terse, and seems to avoid examples of the type of statement that will cause you the most trouble. The basic hassle comes in constructing arrays. A few rules of thumb that will help:

(a) Switches explicitly include the case label, followed by the components of the entry within braces. For example, the definition:

 
case Button:
    boolean     enabled, disabled;
    key bitstring[7] = 4;
    pstring;

has two components (boolean and pstring) that actually are defined by you. The format of the entry that is fed to Rez is:

 Button {enabled,"OK!"};

(b) Arrays are enclosed by an outer set of braces. Each element of an array is separated from the next by a semicolon. Each element can consist of any number of pieces separated by commas. No semicolon before the last brace! For example:

 { start of array
 1,2,3; 3 pieces, element #1
 4,5,6  3 pieces, element #2 (note no semicolon!)
 } end of array

(c) Remember that points and rectangles, altho enclosed in braces, count as 1 "piece" in an array definition.

DeRez is the complement of Rez (as if you couldn't guess!) : feed it a file containing resources and get back Rez input to recreate those resources. You would normally use the Apple-supplied resource description files to provide the format information needed.

Tools and the integrated environment: a great feature of MPW is the support given for building additional tools that can be automatically integrated into the existing environment. Using the facilities built into the environment can allow a tool to be extremely flexible and powerful without adding burdensome complexity for the programmer.

The linker and library facilities: On the whole, the linker is quite powerful and flexible, but there are certain things about the MDS linker that are missing and would be kinda nice to have back (which I'll go into at the end of this section).

In a move aimed (I think) at encouraging third-party vendors to at least provide a version of their development products that can integrate into MPW, Apple has published the exact details of the linker object-file format. This openness should ensure that outside languages and packages gradually become available in the MPW world, allowing each programmer to mix and match pieces to get the set of features she/he wants.

The linker does quite a lot of work in the process of building a program. It has optimization features which can help to shrink a program by:

-building the smallest jump table possible

-eliminating dead code and data modules

-changing the code it's processing to use the most efficient addressing modes possible (A5 relative between segments, PC relative within segments).

Combined with the library program, a program can be shrunk quite a bit and the linking process made noticeably faster. There isn't much to be said about actually setting up a link command; the manual is pretty clear. The main reason the process is straightforward is that you have almost no control of how a program is segmented in the link command.

I have decidedly mixed feelings about the method used in MPW to set up the segmentation of a program. Pascal uses {$S segname} directives embedded in the source files; C uses #define __SEG__ segname. This is designed to allow you to write related functions together in a single source file while putting them in different segments at run time. I have been used to the method used in the MDS/Consulair linker/librarian, which allows you to resegment a program by simply changing the linker command file, rather than forcing you to recompile large chunks of the entire program after making source-file changes. It is possible to combine a number of segments into a new segment using the link command, but it doesn't seem possible to rearrange modules into a different segment arrangement. When you start working with applications with LOTS of segments, this method starts to be somewhat bothersome.

Another weak point of the linker is the map that is generated. There is simply not enough information in it to be really useful, and it is also incompatible with TMON. Specifically, it doesn't give the A5-relative address of the global variables, which is a crucial piece of information.

Notes on the MPW Assembler: This is a genuine "down town" assembler, with features found in few other PC assemblers (heck, a lot of mainframe assemblers could be envious!). Various neat features were mentioned in the overview at the beginning of this article. However, as in much of MPW, while the details are in the manual, the practical implications of them aren't. A small sample routine is included to demonstrate the points written up here; it generates one 16-byte line of a memory dump for use in a resource editor, debugger, etc.

• Templates. The major change for most programmers is going to be in the use of templates for laying out data structures; many people don't build complex macros, but everybody uses data. One great advantage of templates is that, used correctly, they can relieve you forever of having to calculate the sizes of stackframes, etc., manually. There are a few tricks for using these templates correctly.

(1) Use separate templates for parameters and stack frames. The parameter template should be an ordinary incrementing template, while the stack frame template is a decrementing template set so that its origin is at the A6 entry. By doing this, you can define symbolic constants for the sizes of the parameters and the local variables, killing one common source of errors and frustrations. A sample looks like this:

Parameters RECORD  0              ; incrementing template
OutStr     DS.L    1              ; #3 VAR: where to put output
Offset     DS.L    1              ; #2 VAR: offset in buffer
Handle     DS.L    1              ; #1 handle to data buffer
Size       EQU     *              ; use to adjust SP on exit
           ENDR

name       PROC                   ; stackframes should be local
StackFrame RECORD  {A6Link},DECR
Parms      DS      Parameters     ; right amount allocated!
RetAddr    DS.L    1              ; return address
A6Link     DS.L    1              ; caller's A6
* local variables here
lVar1      DS.W    1
lVar2      DS.L    2
lVar3      DS.B    26
LocalSize  EQU     *              ; use: LINK A6,#LocalSize
           ENDR

(2) When you are using templates as shown above, note that the parameters are listed reversed from the Pascal order, starting with the last one. This is the only disadvantage of the technique, but is needed for it to work.

(3) Don't worry about the addresses/offsets displayed in the listing file! They are corrected by the assembler when it has the entire template in hand, and the values shown later in instructions will be the correct ones.

(4) Be judicious with the WITH and ENDWITH statements. Like their counterparts in Pascal, they can be more inclusive than intended.

(5) When you refer to entries in templates, be careful to "fully qualify" the names, or they will have the value of zero! Sizes, etc. should be used from "incrementing" templates. To adjust the stack on exit using the example above, use:

 ADDA #Parameters.Size,SP 

-NOT-

 ADDA #StackFrame.Size,SP

(6) I generally leave the warnings turned on. The assembler is quite thorough about detecting longer-than-necessary branches and calls, and the warnings let you shorten them immediately.

• DUMP and LOAD commands. These are similar to facilities in the Pascal compiler. The aim is to store pre-digested symbol tables (mostly of the standard equates) in files to save the time wasted in scanning them during each assembly. Unlike Pascal, the assembler won't automatically create a "dumpfile" if it's not found; therefore, the trick is to have a very small assembly program that does nothing except dump the system equates to a file for everybody else's use. That would have only a DUMP command, while all regular files would only have a LOAD command. Your makefile can be set up to cause the "dump application" to be run before anything else if you anticipate any changes in the equate files, or if you include some of your own files that may be fairly stable but not cast in cement.

A few quick experiments on a 2-floppy system show that using the LOAD command saves about 11 seconds per assembly as compared with processing all the equates. As in any big project, every little savings is precious.

• Notes on Macros: this is a somewhat random set of notes on working with the macro facility in the assembler; I haven't used it enough to be an expert, but there are a few simple things that can simplify life:

(1) Labels in Macro expansions. It is not obvious that plain labels and @-labels aren't always adequate, but the assembler simply copies them directly into the output, so there is the potential for "duplicate symbol" errors. The way to avoid this is to use the &SYSINDEX feature. This keeps track of how many times each macro has been invoked, and makes that value available in the form of a 4-digit number. By including &SYSINDEX as part of every label in a macro, you can be certain that there will be no conflicts (at least until you've invoked the same macro for the 10,000th time...). Like this:

 BRA.S  X1&SYSINDEX
*various statements
X1&SYSINDEX ;more code......

which would be expanded to (on the first invocation):

 BRA.S  X10001
*various statements
X10001 ;more code......

(2) Use macros for generating large repetitive tables. Macro variables allow you to use indices in calculating table values, etc. Large lookup tables are often used for speed in time-critical applications (real-time games, for example) and are always tedious to enter by hand.

(3) Use keyword parameters to increase macro readability. One of the strong points of the IBM 360/370 assembler was the use of keyword parameters. Altho the I/O macros were always very complex, the average programmer didn't find them intimidating because he provided most of the values using keyword parameters. It is always more reassuring to write "..,channel=3,.." rather than worry if you are about to waste a half-day because the channel number should have been the 3rd parameter, not the 2nd.

Notes on MPW Pascal: MPW Pascal is going to make a lot of people happy, I think. I am relearning my Pascal after a year as a full-time C programmer, so I am probably missing points an experienced Lisa Pascal programmer would catch, but here goes anyway.

• It is quite Lisa-pascal compatible. I am informed by experienced Lisa Pascal programmers that MPW Pascal is essentially identical to it. This makes developers that have a heavy investment in Lisa Pascal (such as Odesta) obviously very happy. Also, it should be a snap to convert programs written in TML Pascal.

• The $LOAD facility increases speed considerably. This is powerful but somewhat picky. It does not seem to coexist well with the {$U} clause normally used to specify the source files needed in the USES clause. The following does work, however:

(a) In the file STARTUP that is executed when the shell is started, the definition of the variable {PInterfaces} has the full path of the directory containing the sources. This definition can contain multiple pathnames, separated by commas. A definition might be:

Set PInterfaces "{MPW}PInterfaces:,HD20:Project:"
Export PInterfaces

(b) In combination with the above, the FILENAME for a unit must be the same as the UNITNAME, so that the automatic search for UNITNAME.p will function correctly. Thus the full pathname for unit FileStuff is:

HD20:Project:FileStuff.p

(c) I have all the interface units packed together into a single load file, since they don't change:

{$LOAD HD20:Project:obs:ints.dumpfile}
Memtypes,Quickdraw,OSIntf,ToolIntf,PackIntf,MacPrint,

Just using this saves an average of 50 seconds per compile after the first time thru, which builds the load file!

(d) After the named $LOAD clause above, use an unnamed {$LOAD} to end the "dumping" and start processing the rest of the unit files individually, so they can change as often as necessary. Thus, the final arrangement in the main module would be:

{$LOAD HD20:Project:obs:ints.dumpfile}
Memtypes,Quickdraw,OSIntf,ToolIntf,PackIntf,MacPrint,
{$LOAD}
globals, otherUnit, anotherUnit;

• It produces darn good code. Walking thru the output code using TMON or DumpObj reveals that the MPW Pascal compiler does a very good job of optimizing its output. While compiler-generated code will almost never beat out hand-tuned code, MPW's overhead is very low. Commonly used variables are automatically kept in registers, using the WITH statement eliminates duplicate pointer-loading instructions, etc. Word has it that Apple is already working on improving the optimization processing for the next version of the compiler.

• It has object-oriented extensions built in. This has been the subject of several very good articles in MacTutor recently, and I haven't had a chance to play with MacApp yet, so I can't say much, except that it looks like this may actually improve programmer productivity, once the concepts involved sink in.

Brief Notes on MPW C: This is going to make the C programmers in the MPW world happy, without putting the existing C vendors out of business (for example, MPW C lacks the object extensions present in the Pascal package - you can be sure Consulair, Aztec, etc. are going to pick up on this!)

It is based on the Berkeley 4.2 BSD VAX implementation of the Portable C Compiler, and was written by Green Hills Software; the library is based on the Standard AT&T Unix System V library. Since Apple has stated in public that it intends to make Unix available on the Mac (undoubtedly the open Mac), it is to be hoped that the same compiler will be used there - to have two slightly incompatible compilers put out by Apple for the same machine would be greatly insane!

Various Notes

Note 1: here's how you do go about putting pictures or other resources into never-compiled resource files for input to Rez, using ResEdit:

(1) Start with the picture in the Scrapbook.

(2) At the "file" level of ResEdit, create a new file.

(3) Select the picture in the scrapbook, and copy it.

(4) Paste it into the brand-new file. A 'PICT' resource should be created.

(5) Change the ID of the new PICT resource to something convenient for later use (starting with 128 isn't a bad idea).

(6) When you close the file, answer "yes" when it asks if you want to save the changes!

#UserStartUp - MPW Shell UserStartUp File
#
#Copyright Apple Computer, Inc. 1985, 1986
#All Rights Reserved.

#This file (UserStartUp) is executed from the StartUp file, and can be used
#to override definitions made in StartUp, or to define additional variables,
#exports, and aliases.  UserStartUp may also be used to define menu items,
#open windows, etc.  The file should be located in the directory containing
#the MPW Shell.

alias cd directory
alias clone duplicate
set ask "request 'Structure Level:' "
export ask
if "`request -d Pascal 'Pascal or C:'`" =~ /[Pp]as /
 "{MPW}macros:Pas_Macros"
else
 "{MPW}macros:c_macros"
End

set exit 0   #so confirm won't kill things on 'no'
set scr "{MPW}scratch"; export scr
open "{scr}"
confirm 'Need the Rez menu?'
if {status} == 0
 begin
 # definitions to make nesting simpler...
 set br "request 'BoundsRect? (t,l,b,r)'" ; export br
 set tit "request 'Title?'"; export tit
 set cid "request 'ID?'"; export cid
 set st "request 'String?'"; export st
 "{MPW}macros:rez_macros"
 end
end

set gloss {MPW}glossary; export gloss
open "{gloss}"
"{MPW}macros:nifties"

set PasMatOpts "-a -k -l -n -r -@"
export PasMatOpts

Addmenu Nifties 'Formatted Hardcopy/9' ∂
 'Pasmat "{active}" | Print -f Monaco -h -s 9 -hf Times -hs 12 -title "{active}"'


#GetResAttr Utility
#This gets the resource-ID, name, and 
#preload attributes for a def
#Version 1.1

set exit 0
set rid "`request 'Resource ID?'`"
echo {rid} > {MPW}scratch #make available for outside use
set rname "`request 'Resource Name?'`"

echo -n "resource '{1}' ({rid}"
if "{rname}"   ""
 echo -n ",""{rname}"""
end

confirm 'Set Attributes? (default: Purgeable)'
if {status} == 0
  begin
  confirm "SysHeap?" ; set sys {status}
  confirm "Purgeable?" ; set pur {status}
  confirm "Locked?" ; set loc {status}
  confirm "Preload?" ; set pre {status}
  if {sys} == 0 ; echo -n ",SysHeap" ; end
  if {pur} == 0 ; echo -n ",Purgeable" ; end
  if {loc} == 0 ; echo -n ",Locked" ; end
  if {pre} == 0 ; echo -n ",Preload" ; end
  end
else
  echo -n ",Purgeable"
end

echo ") {"



#Interactive macros to build Rez definitions
#Frank Alviani - Tuesday, December 9, 1986 2:02:07 PM
#Used with another language menu; no 'top' or 'bottom'
#  macros needed

AddMenu Rez 'Alert' '"{MPW}macros:doAlert"'
AddMenu Rez 'Bundle' '"{MPW}macros:doBundle"'
AddMenu Rez 'Control' '"{MPW}macros:doControl"'
AddMenu Rez 'DITL' ∂
'begin; ∂
 set lclItm 1; ∂
 "{MPW}macros:getResAttrs" DITL; ∂
 echo "{"; ∂
 end'
AddMenu Rez 'End DITL' 'replace \;\ "∂n}∂n};∂n" "{active}"'
AddMenu Rez 'DLOG' '"{MPW}macros:doDLOG"'
AddMenu Rez 'FREF' '"{MPW}macros:doFREF"'
AddMenu Rez 'MENU' '"{MPW}macros:doMENU"'
AddMenu Rez 'SIZE' '"{MPW}macros:doSize"'

AddMenu Rez 'STR ' ∂
'begin;
 "{MPW}macros:getResAttrs" "STR ";
 set str "`{st}`";
 echo "  ∂"{str}∂"" ;
 echo "};";
 end'
 
AddMenu Rez 'STR#' ∂
'begin;
 "{MPW}macros:getResAttrs" "STR#";
 echo -n "∂{ ";
 loop;
 set str "`{st}`";
 break if ("{str}" == "") OR ({status} != 0);
 echo "  ∂"{str}∂";" ;
 end;
 cut \;\ "{active}" ;
 echo "∂n}∂n};";
 end'
 
AddMenu Rez 'WIND' '"{MPW}macros:doWIND"'

AddMenu Rez '(-' ''

AddMenu Rez 'Button' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set tex "`{tit}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  button {{onoff},"{tex}"};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'Checkbox' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set tex "`{tit}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  Checkbox {{onoff},"{tex}"};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'RadioButton' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set tex "`{tit}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  RadioButton {{onoff},"{tex}"};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'ControlItem' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set cntlID "`{cid}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  Control {{onoff},{cntlID}};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'StaticText' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set tex "`{tit}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  StaticText {{onoff},"{tex}"};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'EditText' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set tex "`{tit}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  EditText {{onoff},"{tex}"};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'Icon' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set cntlID "`{cid}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  Icon {{onoff},{cntlID}};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'Picture' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  set cntlID "`{cid}`";
  echo "{{rr}},t/* {lclItm} */";
  echo "  Picture {{onoff},{cntlID}};";
  set lclItm `evaluate {lclItm} + 1`;
 end'
 
AddMenu Rez 'UserItem' ∂
'begin;
  set exit 0;
  set rr "`{br}`";
  confirm "Enabled?";
  if "{status}" == 0 ; set onoff enabled ;
   else ; set onoff disabled; end;
  echo "{{rr}},t/* {lclItm} */";
  echo "  UserItem {{onoff}};";
  set lclItm `evaluate {lclItm} + 1`;
 end'

AddMenu Rez '(-' ''

AddMenu Rez 'Shift Item' '"{MPW}macros:doShift"'
AddMenu Rez 'Adjust Width' '"{MPW}macros:doWiden"'
AddMenu Rez 'Adjust Height' '"{MPW}macros:doTaller"'


# doBundle Macro
#This interactively builds a 'BNDL' Rez definition
#Frank Alviani Saturday, December 6, 1986 10:32:45 AM

set exit 0  #so 'cancel' buttons not fatal

"{MPW}macros:getResAttrs" BNDL
set sig "`request 'Signature? (4 chars)'`"
set ver "`request -d 0 'Version Number?'`"
echo "'{sig}',t/* signature */"
echo -n "{ver},t/* version */n{"
set ct 1
set st 128
loop
 set ty "`request 'Resource type? (4 chars)'`"
 break if ({status} != 0) OR ("{ty}" == "")
 set ct "`request -d {ct} 'How many?'`"
 set act "`request -d {st} 'Starting actual ID?'`"
 set st {act}  #update default starting #
 set loc 0  #local ID
 echo -n "'{ty}',∂t/* type */∂n ∂{"
 loop
 if {loc} != 0 ; echo -n "," ; end
 echo -n "{loc},{act}"
 set loc `evaluate {loc} + 1`
 set act `evaluate {act} + 1`
 break if {loc} == {ct}
 end
 echo "};"
end
replace \;\ "∂n}};∂n" "{active}"

echo "∂ntype '{sig}' as 'STR ';"
echo "resource '{sig}' (0) ∂{"
echo '"Skeleton Application - Version 1.0"'
echo "};"




# doFREF Macro
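#This interactively builds an 'FREF' rez definition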
set exit 0  #so 'cancel' buttons not fatal
"{MPW}macros:getResAttrs" FREF
set ty "`request 'File Type (4 chars)'`"
set in `request 'Icon ID'`
set tex "`request 'Title?'`"
set ty "'{ty}'"
echo "  {ty}, {in}, "{tex}"n};"




# Window Resource Macro
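#This interactively builds a 'WIND' rez definition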
set exit 0  #so 'cancel' buttons not fatal

"{MPW}macros:getResAttrs" WIND

set br "`request 'BoundsRect? (t,l,b,r)'`"
set ty `request -d d 'Box: <D>oc|<B>ox|<P>lain|<A>lt|<N>ogro|<Z>oom|<R>nd'`
confirm 'Visible?'
if {status} == 0
 set vis "visible"
else
 set vis "invisible"
end
confirm 'GoAway?'
if {status} == 0
 set go "goAway"
else
 set go "noGoAway"
end
set ref `request -d 0 'Reference Constant?'`
set tex "`request -d Untitled 'Title?'`"

#determine window type
if {ty} =~ /[Dd]≈/
 set wtyp "documentProc"
else
 if {ty} =~ /[Bb]≈/
 set wtyp "dBoxProc"
 else
 if {ty} =~ /[Pp]≈/
 set wtyp "plainDBox"
 else
 if {ty} =~ /[Aa]≈/
 set wtyp "altDBoxProc"
 else
 if {ty} =~ /[Nn]≈/
 set wtyp "noGrowDocProc"
 else
 if {ty} =~ /[Zz]≈/
 set wtyp "zoomDocProc"
 else
 if {ty} =~ /[Rr]≈/
 set wtyp "rDocProc"
 else
 set wtyp "unKnown"
 end
 end
 end
 end
 end
 end
end

#fill out rez entry
echo "  {{br}},"
echo "  {wtyp},"
echo "  {vis},"
echo "  {go},"
echo "  {ref},t/* refcon */"
echo "  "{tex}""
echo "};"



# Create Menu Resource
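#This interactively builds a 'MENU' rez definition,
#re-using the resource ID saved in the scratch file by getResAttrs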
set exit 0  #so 'cancel' buttons not fatal

"{MPW}macros:getResAttrs" MENU
#set id "`request 'Menu ID?'`"
set mdef "`request -d textMenuProc 'ID of MenuDef proc?'`"
set flags "`request -d allEnabled 'Enable flags (8 hex no.)'`"
set mEnable "`request -d enabled 'Enable Menu?'`"
set title "`request 'Title? (type "apple" for that menu)'`"

#put overall info together here
copy •:/n/ "{scr}" #get saved ID
paste § "{active}"
echo ",ttt/* ID */"
echo "{mdef},t/* menu def proc ID */"
echo "{flags},tt/* item flags */"
echo "{mEnable},ttt/* menu enable */"
if {title} =~ /[Aa]pple/
 echo "{title},"
else
 echo ""{title}","
end
echo -n "t{ "

#put together menu items
set ict 0 #item count
loop
 set ititle "`request 'Item?'`"
 break if ({status} != 0) OR ("{ititle}" == "")
 set icon "noIcon"
 set key "noKey"
 set char "noMark"
 set style "plain"
 confirm "Skip item attributes?"
 if {status} != 0  #wants attributes
 begin
 set icon "`request -d noIcon 'Icon No? (1-based; I will adjust)'`"
 if {icon} != "noIcon"
 set icon `evaluate {icon} + 256`
 end
 set key "`request -d noKey 'Key equivalent?'`"
 confirm "Check?"
 if {status} == 0
 set char check
 else
 set char "noMark"
 end
 confirm "Bold?"
 if {status} == 0
 set style bold
 else
 set style "plain"
 end
 end
 end
 if {ict} != 0 ; echo -n "∂t  " ; end  # alignment
 echo "∂"{ititle}∂","
 echo -n "∂t  {icon},"

 if {key} != "noKey" #don't quote noKey
 echo -n "∂"{key}∂","
 else
 echo -n "{key},"
 end

 echo "{char},{style};"
 set ict `evaluate {ict} + 1`
end

cut \;\ "{active}" #no ; after last entry
echo "nt}n};"



#This interactively builds a 'DLOG' rez definition
#Frank Alviani - Tuesday, December 9, 1986 9:40:06 AM

set exit 0  #so 'cancel' buttons not fatal
echo "/* `request 'Purpose:'` */"
"{MPW}macros:getResAttrs" DLOG

set br "`request 'BoundsRect? (t,l,b,r)'`"
set ty `request -d d 'Box: <D>oc|<B>ox|<P>lain|<A>lt|<N>ogro|<Z>oom|<R>nd'`
confirm 'Visible?'
if {status} == 0
 set vis "visible"
else
 set vis "invisible"
end
confirm 'GoAway?'
if {status} == 0
 set go "goAway"
else
 set go "noGoAway"
end
set ref `request -d 0 'Reference Constant?'`
set ttl "`request -d Untitled 'Title?'`"

#determine window type
if {ty} =~ /[Dd]≈/
 set wtyp "documentProc"
else
 if {ty} =~ /[Bb]≈/
 set wtyp "dBoxProc"
 else
 if {ty} =~ /[Pp]≈/
 set wtyp "plainDBox"
 else
 if {ty} =~ /[Aa]≈/
 set wtyp "altDBoxProc"
 else
 if {ty} =~ /[Nn]≈/
 set wtyp "noGrowDocProc"
 else
 if {ty} =~ /[Zz]≈/
 set wtyp "zoomDocProc"
 else
 if {ty} =~ /[Rr]≈/
 set wtyp "rDocProc"
 else
 set wtyp "unKnown"
 end
 end
 end
 end
 end
 end
end

#fill out rez entry
echo "  {{br}},"
echo "  {wtyp},"
echo "  {vis},"
echo "  {go},"
echo "  {ref},t/* refcon */"
copy •:/n/ "{scr}" #get saved ID
paste § "{active}"
echo ",t/* DITL ID# */"
echo "  "{ttl}""
echo "};"




#This interactively builds a 'CNTL' rez definition
#Frank Alviani  -  Monday, December 8, 1986 1:33:54 PM

set exit 0  #so 'cancel' buttons not fatal
echo "/* `request 'Purpose:'` */"
"{MPW}macros:getResAttrs" CNTL
set p `request 'Type: <B>tn|<C>heckbox|<R>adio|<S>croll (+<F>ont)'`
set rr "`request 'BoundRect? (t,l,b,r)'`"
set val `request -d 0 'Value?'`
set min `request -d 0 'Minimum?'`
set max `request -d 0 'Maximum?'`
confirm 'Visible?'
if {status} == 0
 set vis "visible"
else
 set vis "invisible"
end
set ref `request -d 0 'Reference Constant?'`
set ttl `request -d x 'Title?'`

#determine type
if {p} =~ /[Bb]≈/
 set ctyp "pushButProc"
else
 if {p} =~ /[Cc]≈/
 set ctyp "checkBoxProc"
 else
 if {p} =~ /[Rr]≈/
 set ctyp "radioButProc"
 else
 if {p} =~ /[Ss]≈/
 set ctyp "scrollBarProc"
 end
 end
 end
end

echo "  {{rr}},"
echo "  {val},t/* value */"
echo "  {vis},"
echo "  {max},t/* max */"
echo "  {min},t/* min */"
echo -n "  {ctyp}"
if {p} =~ / [Ff] /
 echo "UseWFont,t/* type */"
else
 echo ",t/* type */"
end
echo "  {ref},t/* refcon */"
echo "  "{ttl}""
echo "};"

Additional resource macros and make utilities for Pascal and C are available on the source code disk for this issue.

 
