Friday, December 25, 2009

Chapter 4: More Dynamic Drop-Down Menus and the Incredible, Versatile :Target



















Overview



Chapter 3 might have made it sound impossible to create dynamic drop-down menus using only CSS and XHTML, at least if you want to use only a click to open a menu. Although the points made were true, there is an approach using only CSS that allows a user to click to open a menu. That approach, which I focus on in this chapter, is useful for a number of things, from creating drop-down menus to custom pop-up dialogs.


I decided not to go into this approach in Chapter 3 because it does have some limitations, albeit limitations that are not impossible to work around. I also decided to focus on this approach in a chapter all its own, because it is possible to do more than merely create drop-down menus. The technique presented here can also be used to create other stylistic changes when a user clicks on something within a web document. For instance, it can be used to show a custom pop-up dialog, or to create tab-based navigation where the content of every tab exists within the same document instead of in separate documents, as demonstrated in Chapter 1.


A great benefit of this approach is that it conforms to the accessibility requirements presented in Chapter 3: the document remains accessible from the keyboard, with very little or no JavaScript required. This chapter also prepares you for the project in Chapter 5, which further explores the possibilities of using :target in a web project.


The most compelling benefit of this approach is that it removes the need for complex JavaScript. I do not feel that using JavaScript is a bad thing, or that JavaScript should be avoided; in fact, quite the contrary: JavaScript makes rich, interactive applications possible. However, I believe that a web designer should take the easiest, most intuitive, most efficient route possible while preserving accessibility. This approach, when compared to the JavaScript I prepared for Chapter 3, wins hands down for simplicity and accessibility. The next section elaborates on the components used in the design of this project and discusses this in more detail.
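To make the idea concrete before diving in, here is a minimal sketch of the :target technique. The id and markup below are illustrative assumptions, not the chapter's own project files: a menu stays hidden until its id matches the URL fragment, which is exactly what happens when the user clicks the link.

```html
<!-- Illustrative sketch only: the id "menu" and these links are
     assumptions, not the chapter's project markup. -->
<style>
  #menu { display: none; }          /* hidden by default */
  #menu:target { display: block; }  /* shown once #menu is the URL fragment */
</style>

<a href="#menu">Open menu</a>
<ul id="menu">
  <li><a href="first.html">First item</a></li>
  <li><a href="second.html">Second item</a></li>
</ul>
```

Because the trigger is an ordinary link, it can be reached and activated from the keyboard, which is what keeps this approach accessible.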














Recipe 20.3. Printing a Stack Trace













20.3.1. Problem


You want to know what's happening at a specific point in your program, and what happened leading up to that point.




20.3.2. Solution


Use debug_print_backtrace():


function stooges() {
print "woo woo woo!\n";
larry();
}

function larry() {
curly();
}

function curly() {
moe();
}

function moe() {
debug_print_backtrace();
}

stooges();



This will print:


woo woo woo!
#0 moe() called at [backtrace.php:14]
#1 curly() called at [backtrace.php:10]
#2 larry() called at [backtrace.php:6]
#3 stooges() called at [backtrace.php:21]





20.3.3. Discussion


The debug_backtrace() function was introduced in PHP 4.3.0, followed by the handy debug_print_backtrace() function in PHP 5.0.0. This combination allows you to quickly get a sense of what has been going on in your application immediately before you called a particular function.


The more complicated your application, the more information you can expect to have returned from the backtrace functions. For debugging larger codebases, you may achieve bug-hunting success more quickly using a full debugging extension, such as Xdebug, or an integrated development environment (IDE), such as PHPEdit or Zend Studio, that supports setting breakpoints, stepping in and out of blocks of code, watching the evolution of variables, and more.


If all you need is a little more information than you can get from sprinkling print 'Here I am on line ' . __LINE__; statements throughout your code, debug_backtrace() and/or debug_print_backtrace() will suit your needs well.
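When you want the trace as data rather than printed text, debug_backtrace() returns an array of frames that you can format or log yourself. A sketch of the idea (the helper name whereAmI is hypothetical):

```php
<?php
function whereAmI() {
    foreach (debug_backtrace() as $i => $frame) {
        // Each frame is an associative array; 'file' and 'line' can be
        // absent for some internal calls, so check before using them
        $file = isset($frame['file']) ? $frame['file'] : '';
        $line = isset($frame['line']) ? $frame['line'] : 0;
        printf("#%d %s() called at [%s:%d]\n",
               $i, $frame['function'], $file, $line);
    }
}
?>
```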


If you're still using PHP 4 and want the PHP 5-only debug_print_backtrace() function, you can use PEAR's PHP_Compat compatibility package. PHP_Compat provides an implementation of debug_print_backtrace() that is identical to the native PHP 5 function.




20.3.4. See Also


Documentation on debug_backtrace( ) at http://www.php.net/debug-backtrace and on debug_print_backtrace( ) at http://www.php.net/debug-print-backtrace; the PEAR PHP_Compat package at http://pear.php.net/package/PHP_Compat; Zend Studio IDE at http://www.zend.com/products/zend_studio; PHPEdit IDE at http://www.waterproof.fr/products/PHPEdit/.













Section 5.9.  Summary












In this chapter, we discussed the difference between SOX and COBIT. We also discussed a basic process for developing IT strategic plans in an effort to comply with the Planning and Organization domain. Finally, we looked at some real-world examples of forms and processes used to comply with the Sarbanes-Oxley Act. In summarizing this chapter, there are three fundamental things you should take away with you:


  • Let your unique organizational structure drive the applicable domain items.

  • When developing processes, ensure that they follow a good quality methodology, such as PDCA (Plan, Do, Check, and Act).

  • Above all, if you have existing processes that are sound and already ingrained within the organization, customize and modify them to work within COBIT.


We also explored the definition and approval routing of your IT business policies, which form the core of your IT strategy and are the basis from which all procedures grow. Several policies are outlined as a representative set of items you will need to consider for SOX compliance. You can define or modify your own policies; we give you the details on how to accomplish this. You can also define or modify the policy approval workflow process if it does not suit your needs.


We also looked at the first concrete examples on the Live CD in the context of planning and organization. Once you have identified your controls as an organization, it is important to state them in a policy that can then be applied by the solutions implemented later in the book to fulfill the requirements of your policies. In addition, we introduced the "approval" type of workflow, the first of many workflow categories we will explore in the remaining chapters. The approval workflow in this example demonstrates the ability to route specific versions of your policies through a chain of approval.












The Life of an I/O Request





DB2 issues I/O requests via the OS Kernel routines. If SMS or DMS file containers are being used, these requests go through the OS file system code. If DMS device containers are used, DB2 issues I/O requests that are passed to the appropriate device drivers by the OS Kernel. The file system is not used in this case. In either case, the actual request is transferred to the host adapter. Figure 6.2 is an example of an I/O request using the Small Computer System Interface (SCSI).


Figure 6.2. The life of an I/O.


In Figure 6.2, DB2 issues a call (I/O request) to the OS. The OS Kernel routine passes the request to either the OS file system or, in the case of DMS devices, directly to the SCSI device driver. The SCSI device driver sends the appropriate SCSI command to the target device. The target device receives the request and immediately checks its own internal buffers for the data; if the data is not found, the device must perform a disk seek operation to retrieve it. The device issues the seek and disconnects from the bus to wait for the seek to complete. In the meantime, another request can be sent by the adapter over the bus to a different disk. The second disk device receives the request, checks its own buffer, starts the seek to retrieve the data, and disconnects from the bus. Meanwhile, the seek on the first device completes, so that device regains control of the bus and transfers the data. The I/O operation is then complete.


It is important to note that Figure 6.2 illustrates a single request; however, the process works the same way for multiple requests. Multiple requests can be processed if containers are on separate physical disks, and DB2 will use a separate prefetcher per container.


In that case, while the first disk is seeking and has given up control of the bus, the next request from the queue is processed, for example, to Disk2. Disk2 checks its buffer and has to do a seek, causing it to give up control of the bus. Meanwhile, the I/O on Disk1 completes so Disk1 can now transfer the data while Disk2 seeks, and so on. See Figure 6.3 for a sample tablespace disk layout.


Figure 6.3. Sample disk layout.


In the example, three physical disks enable multiple disk seek operations to occur, and spreading the data over three host adapters ensures that the required throughput and high availability can be attained. These disks are configured in a RAID-1 configuration; however, the second copy of the data is not shown. In the next section, we show an example that illustrates how to design and configure tablespaces in a RAID-5 ESS environment.
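A layout like Figure 6.3 could be declared by giving the tablespace one DMS file container per physical disk, which is what lets DB2 dedicate a prefetcher to each container. The paths and sizes below are illustrative assumptions, not the book's configuration:

```sql
-- Illustrative sketch: one container per physical disk, so seeks can
-- proceed in parallel and DB2 can use one prefetcher per container.
CREATE TABLESPACE ts_data
  MANAGED BY DATABASE
  USING (FILE '/disk1/db2/cont0' 10000,
         FILE '/disk2/db2/cont1' 10000,
         FILE '/disk3/db2/cont2' 10000)
  EXTENTSIZE 32
  PREFETCHSIZE 96;
```

Setting PREFETCHSIZE to a multiple of EXTENTSIZE times the number of containers lets a single prefetch request touch every disk at once.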








    Section 8.2.  What Does Monitoring Mean?












    Chapter 2 established the following high-level definition for the COBIT monitoring domain: "The monitoring phase uses the SLAs or baseline established in the subsequent phases to allow an IT organization to not only gauge how they are performing against expectation but also provides them with an opportunity to be proactive."


    Previous chapters discussed good quality practices: Plan, Do, Check, Act (PDCA) and continuous improvement. This chapter attempts to accomplish three things:


    • Give you more information on Deming and his quality system.

    • Illustrate PDCA more clearly.

    • Demonstrate, via Deming's PDCA cycle, that monitoring is not the end of the process, but rather the beginning.



    8.2.1. Deming's PDCA Cycle



    In the 1950s, W. Edwards Deming developed a quality system for the continuous improvement of business processes. Deming's quality system contended that business processes should be analyzed and measured to identify the sources of variations that cause products to deviate from customer requirements. He proposed that business processes be placed in a continuous feedback loop so that managers could identify and change the parts of the process that needed improvement. To illustrate his continuous improvement system, Deming developed a diagram using four arrows in a cyclical pattern. This diagram is commonly known as the PDCA cycle (see Figure 8.1).



    Figure 8.1. Deming's PDCA




    The sections of the diagram are defined as:


    • PLAN: Design or revise business-process components to improve results.

    • DO: Implement the plan and measure its performance.

    • CHECK: Assess the measurements and report the results to the decision makers.

    • ACT: Decide on the changes that are needed to improve the process.


    Although Deming's focus was on industrial production processes, his method and philosophies apply just as easily to modern business practices. If you look carefully at the COBIT Guidelines, you will see a strong resemblance to the Deming PDCA model. Whether intentionally or by accident, these guidelines illustrate the point that good quality business practices endure the test of time.


    How does this apply to the Sarbanes-Oxley Act of 2002 (SOX) and COBIT? Most monitoring activities in COBIT Domain IV, Monitoring, come from service level agreements (SLAs). As much as possible, monitoring activities should be automated via Open Source tools such as Nagios and eGroupware. Keep in mind that when determining your thresholds, you may want to set them slightly below your SLA thresholds so that you have additional time to react and proactively correct problems prior to a service interruption.


    This chapter looks at the specifics of each control objective, and attempts to summarize and distill those that lend themselves to small and medium-sized companies. If a particular control objective or an individual item is not applicable to BuiltRight or NuStuff, it generally does not apply to small to medium-sized companies. (For a complete list of the COBIT Guidelines, please see Appendix A.)




    8.2.2. Monitor the Processes


    This section discusses monitoring processes and activities associated with ensuring that previously defined systems and control objectives perform as expected.



    8.2.2.1. Assessing Performance (SOX and Repositioning)

    This process should include key performance indicators and critical success factors, and be performed on a continuous basis utilizing good quality practices and concepts. As discussed previously, these performance indicators must be SLA-based.




    8.2.2.2. Assessing Customer Satisfaction (SOX and Repositioning)


    Customer satisfaction should be measured at regular intervals, and any shortfalls should be addressed as part of a continuous improvement process. Again, the measurement criteria should be based on SLAs. As part of the normal course of operations, internal controls must be monitored for effectiveness through management and supervisory activities. As with Deming, any deviations must require analysis and corrective action plan(s). Also, these deviations must be reported to the individual responsible for their function and at least one level of management above. Any serious deviations should be immediately reported to executive management. This particular control objective is critical in the development of processes and procedures for SOX compliance.





    8.2.3. Assess Internal Control Adequacy



    Once you have implemented the various policies, processes, and procedures, and have obtained SOX compliance, you must sustain your new environment. This is where the various Open Source tools identified in this book pay off. Because the COBIT guidelines were developed in 1996, a lot of the recommended "Internal Control Adequacy" assessing activities appear to have been incorporated into the SOX compliance process.




    8.2.4. Obtain Independent Assurance


    Although the control objectives in this section have no bearing on Sarbanes-Oxley compliance, they are worth reviewing because they may add credence to the effectiveness of an IT organization after it obtains Sarbanes-Oxley compliance and/or undertakes any repositioning efforts.




    8.2.5. Provide for Independent Audit


    The control objectives in this section aren't required to comply with Sarbanes-Oxley, but because these control objectives are what Sarbanes-Oxley compliance is all about, we felt compelled to list them and provide a few insights. As unfortunate as it is, most small to medium-sized companies can't afford full-time staffing to comply with this COBIT section or to periodically perform self-audits. What might be more feasible and realistic is to designate an audit team made up of existing employees; the main caveat is that an employee auditing a department cannot work within that department. If budgetary funding does exist at your organization, we would advise the periodic use of an independent audit firm, rather than one of the big four, to ensure your controls are still effective. The impartiality of an independent audit firm lends more credence to the audit findings.













    Access to DB2 Universal Database





    DB2 uses external facilities to provide a set of user and group validation and management functions. Users must log on through the external facilities by providing a username and a password. The security facilities validate the username and password provided to ensure that access for this user is allowed.


    You need to have a Windows username that will be used to administer DB2. The username must belong to the Administrators group and be a valid DB2 username. In many cases, DB2 creates a username during installation called db2admin that can be used for administering DB2 and setting up the security for the users on your system.


    After successful authentication, access to objects within a DB2 instance is controlled by granting authorities or privileges to users or groups. Authorities are granted to users to perform administrative tasks on DB2 objects such as loading or backing up data. Privileges are granted to users to access or update data. See Figure 7.2 for the possible DB2 authorities and privileges allowed.



    Figure 7.2. Hierarchy of authorities and privileges.






    Note



    By default, System Administration (SYSADM) privileges are granted to any valid DB2 username that belongs to the Administrators group on Windows.




    You can change the users who have administrator privileges for each DB2 instance by changing the SYSADM_GROUP parameter. Before you do, however, you need to ensure that the group exists. To check whether this group exists, use the Windows User Manager administrative tool (choose Start | Programs | Administrative Tools | User Manager). If the group exists, it's listed in the lower section of the User Manager window.


    To use another group as the System Administrative group (SYSADM_GROUP), update the Database Manager Configuration file. To change SYSADM_GROUP on the server instance, follow these steps:








    1. In the Control Center, click the + sign beside the Systems icon to list all the systems known to your workstation, and then click the + sign for the system containing the instance you want to update.


    2. Right-click the instance that you want to change (for example, DB2) and select Configure Parameters from the pop-up menu. The DBM Configuration dialog box opens.


    3. The Administration section shows the configuration parameters associated with administration. In the System Administration Authority Group text box, type the name of an existing group to which you want to assign this privilege. The Change DBM Configuration Parameter dialog box appears as you begin to type (see Figure 7.3).



      Figure 7.3. DBM Configuration - SYSADM_GROUP options.






    4. Click OK.


    5. Stop all applications that are using DB2, including the Control Center. When the application or the Control Center is restarted, the new value for SYSADM_GROUP is used.



    You can use these same steps to change the SYSCTRL_GROUP and SYSMAINT_GROUP parameters.
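If you prefer the command line to the Control Center, the same change can be sketched with the DB2 command line processor; the group name dbgroup here is an illustrative assumption:

```
db2 update dbm cfg using SYSADM_GROUP dbgroup
db2stop
db2start
```

As with the Control Center route, the new value takes effect only after the instance is stopped and restarted.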









      Utilizing Properties for Tables and Columns





      Tables and columns, like other objects in your database, have properties that allow you to control the data that is going into your tables. For example, in the Customers table, you can see the properties for the first column, called CustomerID. The extent to which you use the properties depends on what your needs are.


      At the table level, you also have properties that you can utilize that help you create and enforce business rules, which are discussed later in this chapter.


      You will create your tables by breaking down your data into logical entities. When you do so, keep in mind how you break them down: the goal is a design that follows the rules of what is called normalization.
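As a sketch of what normalization means in practice (these table and column definitions are illustrative, not the sample database's), repeating customer details live in one table, and orders merely reference them by key instead of duplicating them on every row:

```sql
-- Illustrative sketch of a normalized pair of tables: customer details
-- are stored once and referenced from Orders via CustomerID.
CREATE TABLE Customers (
    CustomerID   int IDENTITY PRIMARY KEY,
    CustomerName varchar(50) NOT NULL,
    Phone        varchar(20)  NULL
);

CREATE TABLE Orders (
    OrderID    int IDENTITY PRIMARY KEY,
    CustomerID int NOT NULL REFERENCES Customers (CustomerID),
    OrderDate  datetime NOT NULL
);
```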


      SQL Server has come a long way over the years. For every version, Microsoft works hard not only to make SQL Server more powerful, but also easier to work with. This includes tools that come with the product and from other applications. Visual Studio .NET is a good example of tools for working with SQL Server from another product.


      If you are not familiar with databases, here's a quick overview. Databases allow you to work with data in a manner that reflects the real world on the computer. You can take a real subject, such as Customers, and store that information in tables. A file cabinet is analogous to a database. Within the file cabinet you may have your client folders. Other folders might contain information on Orders or Invoices. One of these folders could be compared to a table of customers. Within the Customers folder, you might have individual pages of information on a customer. Each page that you have on an individual customer would be a row, or a record within a table. On each page, you would have pieces of information such as Customer Name, Address, Phone, and so on. These would be fields, or columns, within each row.


      In a database, you will also have objects that allow you to query information within tables and update information. In SQL Server, you will use Views, Stored Procedures, and Functions to view and update data within the database. To use these objects, you need to be able to create them. To create a database along with its tables in SQL Server, you can use code or tools that came with SQL, such as the Enterprise Manager, if you have installed one of the versions that include these tools.


      Fortunately, you can use tools that are built within Visual Studio .NET to create and modify your databases. The primary tool you will use is called the Server Explorer, as shown in Figure 2.1.


      Figure 2.1. From the Server Explorer within Visual Studio .NET, you can perform most of the tasks that are necessary to maintain a database.










        12.5 Work with Datasets and XML





        Sometimes, I have to pull XML documents into datasets and vice versa. How do I accomplish this using .NET?


        Technique


        .NET provides a number of ways to utilize datasets and XML together. The simplest use is pushing data between the two. To do this, you have two methods belonging to the DataSet object: ReadXml and WriteXml. For both of these methods, you need to provide a filename to read the XML document from or write it to.
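Stripped to its essentials, the round trip is just two calls. This sketch uses an illustrative file path rather than the How-To's form code:

```vb
' Illustrative sketch: write a DataSet out as XML, then read it back.
Dim ds As New DataSet()
' ... fill ds with one or more DataTables ...
ds.WriteXml("c:\Test.xml")      ' dataset -> XML document

Dim dsBack As New DataSet()
dsBack.ReadXml("c:\Test.xml")   ' XML document -> dataset
```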


        To demonstrate how to take advantage of these methods, I created a form that looks similar to the other How-Tos showing how to write XML documents. However, for this example, I also added another button and data grid that will show the data after reading from the XML document.


        Steps


        Open and run the Visual Basic .NET - Chapter 12 solution. From the main Web page, click on the hyperlink with the caption How-To 12.5: Working with Datasets and XML. As with How-To 12.1, when the page loads, you can enter a few names. Enter the last and first names, and then click the button labeled Add to DataTable. When you have added a few names, click the button labeled Create XML File. Using Explorer, open the file created in C:\ called test.xml. If you click Read XML File, you will see the same data because it was read from test.xml (see Figure 12.5).










        1. Create a Web Form. Then place the Labels, TextBoxes, Buttons, and DataGrid objects as seen in Figure 12.5 on the form with the properties set as in Table 12.9.
















          Table 12.9. Label, TextBox, and Button Control Property Settings

          Object     Property     Setting
          Label      Text         Last Name
          TextBox    ID           txtLastName
          Label      Text         First Name
          TextBox    ID           txtFirstName
          Button     ID           btnAdd
                     Text         Add to DataTable
          Button     ID           btnCreateXMLFile
                     Text         Create XML File
          DataGrid   ID           dgDataToWrite
          Button     ID           btnReadFile
          DataGrid   ID           dgResultsFromXML
          HyperLink  ID           hplReturnToMain
                     NavigateURL  wfrmMain.aspx


      2. Add the following lines to the code module of the form, under the line that reads Web Form Designer Generated Code.



        Dim mdtData As New DataTable()
        Dim mdsData As New DataSet()
      3. Add the code in Listing 12.11 to the Load event of the page. If the data table has not been saved to the Session object, it is created from scratch by first creating the data columns and then adding them to the data table. The DataTable object is then saved to the Session object with the name MyDataTable. A DataSet object is also created because some of the XML methods must be used from the DataSet object, rather than at the DataTable level. If the Session entries already exist, they are assigned back to the module variables mdtData and mdsData. Last, the data table is bound to the DataGrid object by calling the BindTheGrid routine, which is described in the next step.


        Listing 12.11 wfrmHowTo12_5.aspx.vb: Creating a DataTable Object from Scratch


        Private Sub Page_Load(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) Handles MyBase.Load

        'Put user code to initialize the page here
        If (Session("MyDataTable") Is Nothing) Then

        Dim dcFirstName As New DataColumn()

        dcFirstName.ColumnName = "FirstName"
        dcFirstName.Caption = "First Name"

        mdtData.Columns.Add(dcFirstName)

        Dim dcLastName As New DataColumn()

        dcLastName.ColumnName = "LastName"
        dcLastName.Caption = "Last Name"

        mdtData.Columns.Add(dcLastName)
        mdsData.Tables.Add(mdtData)

        Session("MyDataTable") = mdtData
        Session("MyDataSet") = mdsData

        Else
        mdtData = CType(Session("MyDataTable"), DataTable)
        mdsData = CType(Session("MyDataSet"), DataSet)
        End If

        BindTheGrid()

        End Sub
      4. Create the routine BindTheGrid, shown in Listing 12.12, in the code module for the page.


        Listing 12.12 wfrmHowTo12_5.aspx.vb: Binding the Data Table to the Data Grid


        Sub BindTheGrid()

        dgDataToWrite.DataSource = mdtData
        dgDataToWrite.DataBind()

        End Sub
      5. Add the code in Listing 12.13 to the Click event of the btnAdd button. This routine starts by calling the NewRow method of the mdtData data table, creating a new DataRow object, drNew. The two columns in drNew are filled with the values in txtLastName and txtFirstName. The new row is added to the data table, and the text boxes are cleared. Last, mdtData is rebound to the data grid by calling BindTheGrid.


        Listing 12.13 wfrmHowTo12_5.aspx.vb: Adding Data to the Data Table and Then Rebinding the Data Grid


        Private Sub btnAdd_Click(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) Handles btnAdd.Click

        Dim drNew As DataRow

        drNew = mdtData.NewRow()

        drNew.Item("LastName") = Me.txtLastName.Text
        drNew.Item("FirstName") = Me.txtFirstName.Text

        mdtData.Rows.Add(drNew)

        Me.txtLastName.Text = ""
        Me.txtFirstName.Text = ""

        BindTheGrid()

        End Sub
      6. Add the code in Listing 12.14 to the Click event of the btnCreateXMLFile button. After loading the dataset from the Session object, the WriteXml method is invoked to save the data into an XML document.


        Listing 12.14 wfrmHowTo12_5.aspx.vb: Creating the XML Document from the Dataset


        Private Sub btnCreateXMLFile_Click(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) Handles btnCreateXMLFile.Click

        mdsData = CType(Session("MyDataset"), DataSet)
        mdsData.WriteXml("c:\Test.xml")

        End Sub
      7. Add the code in Listing 12.15 to the Click event of the btnReadFile button. Here, the code reads the XML document by using the ReadXml method of the dsXMLData DataSet object and then binds it to a DataGrid object.


        Listing 12.15 wfrmHowTo12_5.aspx.vb: Reading the XML Document Back into the Dataset


        Private Sub btnReadFile_Click(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) Handles btnReadFile.Click
        Dim dsXMLData As DataSet = New DataSet()

        dsXMLData.ReadXml("c:\Test.xml")

        Me.dgResultsFromXML.DataSource = dsXMLData
        Me.dgResultsFromXML.DataBind()

        End Sub

      Figure 12.5. The data entered on the form is written to, and then read back from, the XML document.



        Comments


        As you can see, for both reading and writing XML documents from and to datasets, Microsoft has given us some easy commands to accomplish the task. Remember, however, that the other methods, such as using the DOM, give you more control over the format of the XML document.











          Recipe 5.7. Encapsulating Complex Data Types in a String













          5.7.1. Problem


          You want a string representation of an array or object for storage in a file or database. This string should be easily reconstitutable into the original array or object.




          5.7.2. Solution


          Use serialize() to encode variables and their values into a textual form:


          $pantry = array('sugar' => '2 lbs.','butter' => '3 sticks');
          $fp = fopen('/tmp/pantry','w') or die ("Can't open pantry");
          fputs($fp,serialize($pantry));
          fclose($fp);



          To recreate the variables, use unserialize():


          $new_pantry = unserialize(file_get_contents('/tmp/pantry'));





          5.7.3. Discussion


          The serialized string that is reconstituted into $pantry looks like:


          a:2:{s:5:"sugar";s:6:"2 lbs.";s:6:"butter";s:8:"3 sticks";}



          This stores enough information to bring back all the values in the array, but the variable name itself isn't stored in the serialized representation.


          When passing serialized data from page to page in a URL, call urlencode() on the data to make sure URL metacharacters are escaped in it:


          $shopping_cart = array('Poppy Seed Bagel' => 2,
          'Plain Bagel' => 1,
          'Lox' => 4);
          print '<a href="next.php?cart='.urlencode(serialize($shopping_cart)).'">Next</a>';



          The magic_quotes_gpc and magic_quotes_runtime configuration settings affect data being passed to unserialize(). If magic_quotes_gpc is on, data passed in URLs, POST variables, or cookies must be processed with stripslashes() before it's unserialized:


          $new_cart = unserialize(stripslashes($cart)); // if magic_quotes_gpc is on
          $new_cart = unserialize($cart); // if magic_quotes_gpc is off



          If magic_quotes_runtime is on, serialized data stored in a file must be processed with addslashes() when writing and stripslashes() when reading:


          $fp = fopen('/tmp/cart','w');
          fputs($fp,addslashes(serialize($a)));
          fclose($fp);

          // if magic_quotes_runtime is on
          $new_cart = unserialize(stripslashes(file_get_contents('/tmp/cart')));
          // if magic_quotes_runtime is off
          $new_cart = unserialize(file_get_contents('/tmp/cart'));



          Serialized data read from a database must also be processed with stripslashes( ) when magic_quotes_runtime is on:


          mysql_query(
          "INSERT INTO cart (id,data) VALUES (1,'".addslashes(serialize($cart))."')");

          $r = mysql_query('SELECT data FROM cart WHERE id = 1');
          $ob = mysql_fetch_object($r);
          // if magic_quotes_runtime is on
          $new_cart = unserialize(stripslashes($ob->data));
          // if magic_quotes_runtime is off
          $new_cart = unserialize($ob->data);



          Serialized data going into a database always needs to have addslashes( ) called on it (or, better yet, the database-appropriate escaping method) to ensure it's saved properly.


          When you unserialize an object, PHP automatically invokes its
          __wakeup( ) method. This allows the object to reestablish any state that's not preserved across serialization, such as a database connection. Unserializing can therefore alter your environment, so be sure you know what you're unserializing. See Recipe 7.18 for more details.




          5.7.4. See Also


          Recipe 10.9 for information on escaping data for a database.























          11.17. What we need is a way to select descendants



          What we're really missing is a way to tell CSS that we want to select only elements that descend from certain elements, which is kinda like specifying that you want your inheritance to go only to the children of one daughter or son. Here's how you write a descendant selector.



          div h2 {
              color: black;
          }

          Here, div is the parent element and h2 is its descendant; leave a space between the parent name and the descendant name. This rule says to select any <h2> that is a descendant of a <div>. Write the rest of your rule just like you always do.





          [Element tree: html > body > h1, h2, and <div id="elixirs">; the elixirs <div> contains an <h2> and three <h3>s. Here's what this rule selects in the lounge: the <h2> inside the elixirs <div>.]



          Now the only problem with this rule is that if someone created another <div> in the "lounge.html" file, they'd get black <h2> text, even if they didn't want it. But we've got an id on the elixirs <div>, so let's use it to be more specific about which descendants we want:



          #elixirs h2 {
              color: black;
          }

          Now the parent element is the element with the id "elixirs", and h2 is its descendant. This rule says to select any <h2> that is a descendant of an element with the id "elixirs".





          [Element tree: html > body > h1, h2, and <div id="elixirs">; the elixirs <div> contains an <h2> and three <h3>s. This rule selects the same <h2> inside the elixirs <div>. But it's more specific, so if we added another <div> with an <h2> to the page, that's okay, because this rule selects only <h2>s in the elixirs <div>.]




          Sharpen your pencil


          Your turn. Write the selector that selects only <h3> elements inside the elixirs <div>. In your rule, set the color property to #d12c47. Also label the elements in the graph below that are selected.



          [Element tree: html > body > h1, h2, <div id="elixirs">, and <div id="calendar">; the elixirs <div> contains an <h2> and three <h3>s, and the calendar <div> contains an <h1>, an <h2>, and an <h3>.]




























          A Call to Action

          Reflecting, once again, upon the need to nurture a more integrated approach to IT-enabled change, some writers contend that the study of organisational processes of change cannot be conducted well without affecting their very nature and therefore advocate the legitimacy of action-oriented philosophies to guide the research process (Argyris, Putnam, & Smith, 1985; Gummesson, 1991; Schein, 1991). In this regard, action research has received significant attention (Chakravarthy & Doz, 1992; Checkland, 1981; Huber & Van de Ven, 1995; Van de Ven, 1992) while clinical inquiry, a contemporary development of action research, has received relatively limited attention (Coghlan & McDonagh, 2001; McDonagh & Coghlan, 2000).


          Advocating the legitimacy of action research, Chakravarthy and Doz (1992) contend that organisational processes cannot be researched well without possibly affecting their very nature: "Rather than ignore the issue or only harp upon the occasional consulting dimension to process research, we believe action research should gain more legitimacy" (p. 10). Such advocacy is congruent with the core characteristics of action research, a theme that will be addressed later (Argyris et al., 1985; Eden & Huxham, 1996; Susman & Evered, 1978).


          In a similar vein, Van de Ven (1992) contends that embracing an action research approach implies:




          "significant investigator commitment and organisational access, which few investigators have achieved to date. One reason why gaining organisational access has been problematic is because investigators seldom place themselves into the manager's frame of reference to conduct their studies. Without observing a change process from a manager's perspective, it becomes difficult, if not impossible, for an investigator to understand the dynamics confronting managers who are involved in a strategic change effort, and thereby generate new knowledge that advances theory and practice" (p. 181).



          The appropriateness of action research in the study of organisational processes of change has been forcefully argued over the years (Argyris et al., 1985; Gummesson, 1991).





















          About the Author

          Elliotte Rusty Harold is originally from New Orleans, to which he returns periodically in search of a decent bowl of gumbo. However, he currently resides in the Prospect Heights neighborhood of Brooklyn with his wife, Beth, and cats Charm (named after the quark) and Marjorie (named after his mother-in-law). He's an adjunct professor of computer science at Polytechnic University, where he teaches Java, XML, and object-oriented programming. His Cafe au Lait web site (http://www.cafeaulait.org) is one of the most popular independent Java sites on the Internet, and his spin-off site, Cafe con Leche (http://www.cafeconleche.org), has become one of the most popular XML sites. He's currently working on the XOM library for XML, the Jaxen XPath engine, and the Amateur media player. His previous books include Java Network Programming (O'Reilly) and Processing XML with Java (Addison-Wesley).




















          Appendix C. Hibernate SQL Dialects






          C.1. Getting Fluent in the Local SQL


          Hibernate ships with detailed support for many[11] commercial and free relational databases. While
          most features will work properly without doing so, it's
          important to set the hibernate.dialect configuration
          property to the right subclass of org.hibernate.dialect.Dialect, especially
          if you want to use features like <native>
          or <sequence> primary key generation or
          session locking. Choosing a dialect is also a very convenient way of
          setting up a whole raft of Hibernate configuration parameters you'd
          otherwise have to deal with individually.

          [11] I never expected to bump into Caché again, having left the world
          of health care software to work in Java….
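For instance (a sketch; the surrounding <session-factory> configuration is assumed, and the dialect value is taken from the table below), the property can be set in hibernate.cfg.xml like this:

```xml
<!-- Inside the <session-factory> element of hibernate.cfg.xml -->
<property name="hibernate.dialect">org.hibernate.dialect.MySQL5InnoDBDialect</property>
```

The equivalent hibernate.properties line is hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect.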


          Database system                          Appropriate hibernate.dialect setting

          Caché 2007.1                             org.hibernate.dialect.Cache71Dialect
          DB2                                      org.hibernate.dialect.DB2Dialect
          DB2 AS/400                               org.hibernate.dialect.DB2400Dialect
          DB2 OS390                                org.hibernate.dialect.DB2390Dialect
          Derby                                    org.hibernate.dialect.DerbyDialect
          Firebird                                 org.hibernate.dialect.FirebirdDialect
          FrontBase                                org.hibernate.dialect.FrontbaseDialect
          H2                                       org.hibernate.dialect.H2Dialect
          HSQLDB                                   org.hibernate.dialect.HSQLDialect
          Informix                                 org.hibernate.dialect.InformixDialect
          Ingres                                   org.hibernate.dialect.IngresDialect
          Interbase                                org.hibernate.dialect.InterbaseDialect
          JDataStore                               org.hibernate.dialect.JDataStore
          Mckoi SQL                                org.hibernate.dialect.MckoiDialect
          Mimer SQL                                org.hibernate.dialect.MimerSQLDialect
          Microsoft SQL Server                     org.hibernate.dialect.SQLServerDialect
          MySQL (versions prior to 5.x)            org.hibernate.dialect.MySQLDialect
          MySQL (version 5.x and later)            org.hibernate.dialect.MySQL5Dialect
          MySQL (prior to 5.x, InnoDB tables)      org.hibernate.dialect.MySQLInnoDBDialect
          MySQL (prior to 5.x, MyISAM tables)      org.hibernate.dialect.MySQLMyISAMDialect
          MySQL (version 5.x, InnoDB tables)       org.hibernate.dialect.MySQL5InnoDBDialect
          Oracle (any version)                     org.hibernate.dialect.OracleDialect
          Oracle 8i                                org.hibernate.dialect.Oracle8iDialect
          Oracle 9i or 10g                         org.hibernate.dialect.Oracle9Dialect
          Oracle 10g only (ANSI join syntax)       org.hibernate.dialect.Oracle10gDialect
          Pointbase                                org.hibernate.dialect.PointbaseDialect
          PostgreSQL                               org.hibernate.dialect.PostgreSQLDialect
          Progress                                 org.hibernate.dialect.ProgressDialect
          SAP DB                                   org.hibernate.dialect.SAPDBDialect
          Sybase (or MS SQL Server)                org.hibernate.dialect.SybaseDialect
          Sybase 11.9.2                            org.hibernate.dialect.Sybase11Dialect
          Sybase Anywhere                          org.hibernate.dialect.SybaseAnywhereDialect
          Teradata                                 org.hibernate.dialect.TeradataDialect
          TimesTen 5.1                             org.hibernate.dialect.TimesTenDialect
          Unisys 2200 RDMS                         org.hibernate.dialect.RDMSOS2200Dialect



          If you don't see your target database here, check whether support
          has been added to the latest Hibernate release. Most of the dialects are
          listed in the SQL
          Dialects
          section of the Hibernate reference
          documentation. If that doesn't pan out, see if you can find a third-party
          effort to support the database, or consider starting your own!

















          DB2 Architecture


          DB2 uses semaphores and shared memory for interprocess communication. This has enabled DB2 to be the first Relational Database Management System (RDBMS) to support the new InfiniBand storage architecture.


          The DB2 Process Model (Figure 2.3) consists of clients or applications connecting to DB2 databases where a coordinating agent is assigned to process all requests for a particular application. Subagents can be assigned if using the ESE Database Partitioning Feature (DPF) or intrapartition parallelism. Bufferpools are used to store frequently accessed data and I/O servers process prefetch requests. I/O cleaners flush dirty pages from the bufferpools to disk. The logger process records changed information and at the appropriate time writes committed changes to disk in coordination with the bufferpool manager.


          Figure 2.3. Overview of DB2 architecture.


          Communication protocols supported are TCP/IP (the most common), NETBIOS, Named Pipes, and APPC. Work in DB2 is accomplished by Engine Dispatchable Units (EDUs). In UNIX, EDUs are implemented as processes; in Windows, EDUs are implemented as threads. The ps (process status) command can be used on UNIX to display DB2 processes. Use the db2_local_ps command to return a list of all DB2 processes to standard output. Operating system processes will not be shown; only DB2 processes will be shown, which makes it easier to quickly see what DB2 processes are running. On Windows, DB2 threads can be monitored using the Task Manager. See Table 2.3 for a partial list of DB2 processes on UNIX platforms. (For a detailed list, refer to "Everything You Wanted to Know About DB2 UDB Processes," a DB2 Developer Domain tech article by Snow and R. Chung.)


          Table 2.3. DB2 AIX Processes

          Process Name              Function

          db2agent                  Coordinator agent
          db2agntp                  Subagent processes
          db2pfchr                  Prefetching
          db2pclnr                  Page cleaning
          db2loggr                  Log reader
          db2loggw                  Log writer
          db2logts                  Tablespace logger
          db2glock and db2dlock     Global and local deadlock detectors, one per partition
          db2fmp                    Fenced process for UDFs and SPs
          db2reorg                  Online inplace reorg process
          db2sysc                   DB2 engine
          db2tcpcm                  TCP communication manager
          db2ipccm                  IPC communication manager


          Coordinating Agent (db2agent)


          DB2 assigns a coordinating agent to each connected application. The coordinating agent coordinates the work associated with that application. In a partitioned database environment, or when intrapartition parallelism is enabled, coordinating agents also create subagents and coordinate their work.


          Subagents (db2agntp)


          Subagents are created by coordinating agents to do work in parallel. Subagents are used in partitioned database environments or if the intra_parallel DBM CFG parameter is enabled.




















            Chapter 5: The Problem of Common Method Variance in IS Research


            Amy B. Woszczynski
            Kennesaw State University, USA



            Michael E. Whitman
            Kennesaw State University, USA



            Copyright © 2004, Idea Group Inc. Copying or distributing in print or electronic forms without written permission of Idea Group Inc. is prohibited.




            Abstract



            Many IS researchers obtain data through the use of self-reports. However, self-reports have inherent problems and limitations, most notably the problem of common method variance. Common method variance can cause researchers to find a significant effect when, in fact, the effect is due to the method employed. In this chapter, we examined published research in leading information systems (IS) journals to determine if common method variance is a potential problem in IS research and how IS researchers have attempted to overcome problems with method bias. We analyzed 116 research articles that used a survey approach as the predominant method in MIS Quarterly, Information Systems Research, and Journal of Management Information Systems. The results indicate that only a minority of IS researchers have reported on common method variance. We recommend that IS researchers adopt techniques to minimize the effects of common method variance, including using multiple types of respondents, longitudinal designs, and confirmatory factor analysis that explicitly models method effects.
























            Introduction


            The existence and significance of cognition in organizations and its influence on patterns of behaviour in organizations and organizational outcomes are increasingly accepted in information systems (IS) research (Barley, 1986; DeSanctis & Poole, 1994; Griffith, 1999; Griffith & Northcraft, 1996; Orlikowski & Gash, 1992, 1994). However, assessing the commonality and individuality in cognition and eliciting the subjective understanding of research participants either as individuals or as groups of individuals remain a challenge to IS researchers (Orlikowski & Gash, 1994). Various methods for studying cognition in organizations have been offered—for example, clinical interviewing (Schein, 1987), focus groups (Krueger, 1988), and discourse-based interviewing (Odell, Goswami, & Herrington, 1983). This article proposes that cognition applied to making sense of IT in organizations can also be explored using Kelly's (1955) personal construct theory and its methodological extension, the repertory grid (RepGrid). The RepGrid can be used in IS research for uncovering the constructs research participants use to structure and interpret events relating to the development, implementation, use, and management of IS in organizations.


            In the context of this chapter, cognition is considered to be synonymous with subjective understanding, "the everyday common sense and everyday meanings with which the observed human subjects see themselves and which gives rise to the behaviour that they manifest in socially constructed settings" (Lee, 1991, p. 351). Research into cognition in organizations investigates the subjective understanding of individual members within the organization and the similarities and differences in the understandings among groups of individuals (Daniels, Johnson, & de Chernatony, 1994; Jelinek & Litterer, 1994; Porac & Thomas, 1989). In IS research, the focus is on the personal constructs that managers, users, and IS professionals use to interpret and make sense of information technology (IT) and its role in organizations.


            In this chapter, we discuss personal construct theory (Kelly, 1955) and, in particular, the myriad ways the RepGrid can be employed to address specific research objectives relating to subjective understanding and cognition in organizations. We illustrate, from a variety of published studies in IS and management, the flexibility of the RepGrid to support both qualitative and/or quantitative analyses of the subjective understandings of research participants. We hope that this will initiate further discussions and responses to calls for more cognitive emphasis in organizational research (Langfield-Smith, 1992; Stubbart, 1989; Swan, 1997).


            We are not implying in this chapter that the personal construct theory and the RepGrid are the best or the only theory and method available to the IS researcher in the study of cognition in organizational settings. There are certainly other cognitive theories and mapping methods applied in organizational research (Huff, 1990; Sims, Gioia, & Associates, 1986). These produce different types of cognitive maps capable of depicting different perspectives of cognition. Our focus is on the subjective understandings of organizational members—that is, personal constructs applied to everyday sense-making. Kelly's (1955) theory and method are widely accepted in the study of cognitive constructs and understandings of individuals in fields from psychology to management.


            The primary audience is IS researchers who are interested in investigating the cognitive perspective in the development, implementation, use, and management of IS in organizations. It is anticipated that by examining the subjective understandings in organizations, researchers will better understand the interplay of various cognitive dimensions in the organizational context.


            The chapter continues with an elaboration of Kelly's (1955) personal construct theory and the relevance of this theory and its method to IS research and practice. The basics of the repertory grid—its components, various design decisions, and its implications—are then discussed. This discussion is followed by considering how the repertory grid is used in a variety of published studies in the IS and management fields. These examples illustrate the flexibility of the technique to support both qualitative and/or quantitative approaches as related to the specific research objectives. Finally, the chapter concludes with the suggestion that IS researchers consider the adoption of the repertory grid as an appropriate technique in addressing investigations into cognition— the subjective understandings of individuals in organizations.















            Graphics

            What do 3DS Max, Lscript, Lightwave, Alice, Maya, Blender, Animation Master, TrueSpace, RenderMan, and Poser all have in common? Well, besides being graphics programs and 3D applications, they all have Python scripting interfaces. Python is ideal for the struggling artist; it can link up to industry gear and is perfect for creating quick custom tools or automating repetitive tasks.

            Alice

            Alice is a tool for developing three-dimensional graphics, built around the concept of "3D for everyone." Most 3D engines require the programmer to know extensive trigonometry, vector algebra, and other painful math. Alice is designed to provide non-programmers with access to 3D programming and interactive worlds. One of the things that makes Alice powerful is that it has a very straightforward, easy-to-learn GUI (shown in Figure 5.4) for placing, sizing, tweaking, and animating three-dimensional objects and spaces.

            Figure 5.4. The Alice GUI


            Alice is open source and made available by its current developers and copyright holders, the Stage 3 Research Group at Carnegie Mellon University, and can be found online at http://www.alice.org.

            The worlds and content created with Alice are freely distributable, as long as the stipulations in the license are followed. The Alice project initially began at the University of Virginia, and over the years has received support in the form of grants from DARPA, Intel, Microsoft, NSF, Pixar, Chevron, NASA, the Office of Naval Research (ONR), Advanced Network and Services Inc., and the Python community itself.

            Currently Alice supports two-dimensional graphic imports (via drag and drop or through its built-in billboard) and .ase files, which are ASCII Scene Export files used for exporting 3D wire-frames on several 3D modelers (including 3D Studio Max). Alice is also capable of importing music and sounds by using MP3 files. The engine comes equipped with hundreds of models and sounds pre-built and packaged for the newbie.

            Alice actually has draggable programming constructs (for example, if/else statements and loops) that can be used to set the behavior of the models. Underneath the GUI is a complete language that supports methods, arrays, lists, functions, recursion, and so on.

            Alice has recently gone through a complete redevelopment, and work is ongoing to allow Alice to export and import more formats and run on more platforms. Originally, Alice was completely Python: the core, the code, the whole enchilada. With the recent major rewrite (which has been ongoing since 1999), much of the software has been rewritten in Java. However, the engine is still scriptable via Jython.

            Jython is an implementation of Python written completely in Java that targets Sun Microsystems' Java 2 platform. This means Jython has all the dynamic object-oriented features of the Python language and also runs on any Java platform.

            In order to implement Python/Jython scripting in Alice, you need to first enable it. You can turn on Jython scripting under the Preferences menu. Select Edit, Preferences, Enable Jython Scripting, as shown in Figure 5.5.

            Figure 5.5. Enabling Jython scripting in Alice's GUI


            Once scripting is enabled, every object within the Object Tree (the top left-hand window, which includes any instance of three-dimensional objects, including the world itself) is script editable with a right-click of the mouse, or through one-line scripts via a "go" executable line (see Figure 5.6). You can also access scripts when editing methods (Alice has a built-in method editor) with two draggable tiles called Script and Script-Defined Response.

            Figure 5.6. Editing a penguin object script from the object tree


            The Script tile allows you to type in code that will be run when that script method is run in the Alice engine. The Script-Defined Response is used to fire pre-composed Alice animations.

            Objects in Alice can be accessed by name from scripts, and their properties and variables are accessed just like member variables:


            Penguin.isShowing = false

            All of this is pretty powerful: not only can you script objects via Python/Jython, but with Jython you also have access to the entire Java API. The scripts can also call built-in Alice animations and Alice's "RightNow" methods, like those outlined in Table 5.4.

            Table 5.4. Alice's RightNow Methods

            Method                     What it does

            DoInOrder()                Runs a series of animations
            IfElseInOrder()            Runs animation list if the condition is met for if/else statements
            isShowing()                Sets subject to be visible or not visible
            ForEachInOrder()           Iterates through a list
            MoveAnimation()            Moves subject
            moveRightNow()             Moves subject immediately, given direction and amount
            PositionAnimation()        Sets subject position in world
            ResizeAnimation()          Resizes subject
            resizeRightNow()           Resizes subject immediately
            rotateRightNow()           Rotates on given axis immediately
            setOrientationRightNow()   Sets subject's orientation via 3D matrix immediately
            SoundAction()              Plays given sound at specified volume
            TurnAnimation()            Rotates subject
            turnRightNow()             Rotates subject immediately, given amount
            WaitAction()               Waits for given duration
            WhileLoopInOrder()         Runs through animation list while condition is true


            These methods (and many others; check out the Alice2 documentation) can be called on models within Alice, but also on Alice's camera (the "watcher" point of view) and other objects like lights.

            Let's say you wanted to define an animation function in Jython. You can define the animation just like you define any other function:


            def MyAnimation(MyObject):
                return MyAnimation

            In this case, the function MyAnimation will take in MyObject as an argument and send back MyAnimation as the animation series you want the model to execute (assuming that the object will be an Alice model). Now let's set the animation to do something:


            def MyAnimation(MyObject):
                turn = TurnAnimation(MyObject, right, amount=1.0)
                move1 = MoveAnimation(Forward, amount=1.0, duration=1.0)
                move2 = MoveAnimation(Backward, amount=1.0, duration=1.0)
                MyAnimation = DoInOrder(
                    MyObject.IsShowing = true,
                    move1,
                    turn,
                    move2,
                )
                return MyAnimation

            You define move1 and move2 to move forward and backward using Alice's MoveAnimation method. Then you set turn to give the model a spin using TurnAnimation. Finally, you make sure the object is visible by setting MyObject.IsShowing and run your series of animations.

            AutoManga

            Although now nearly defunct, AutoManga is a solution for digital cell animation. Japanese Manga-style animation is the idea behind AutoManga, and the engine is implemented with Python scripts that call C/C++ extensions for SDL routines. The engine was developed by Terry Hancock, has had a number of other contributors over the years, and originally was to be connected to the Python Universe Builder to handle interactive fiction and use XML for sequencing resource files.

            Much of AutoManga was completed, including lighting effects and the ability to pull a few different formats for background images and animation cells, but the project unfortunately hasn't seen much action in the past year or two. Still, it is a good starting point for frame and cell based Python animation; the developer notes and files are located on Sourceforge, at http://automanga.sourceforge.net/.

            Blender

            Blender is a 3D graphics suite with a tumultuous history. Originally, Blender was a rewrite of the 3D toolset of NeoGeo, a Netherlands animation house. One of the co-founders of NeoGeo, Ton Roosendaal, also founded a spin-off company called Not a Number (NaN). This company's model was to further develop and market Blender technology. Initially the company fared very well, raising millions of dollars and gaining thousands of customers, but it was hit by hard economic times. In 2001, the company announced bankruptcy and the investors closed down NaN.

            Blender, however, proved to have a strong will to live. Roosendaal started a non-profit foundation and began the "Free Blender" campaign with the idea of opening up Blender to the community as an open-source project. He worked with NaN's investors on a plan wherein the Blender Foundation would be able to purchase the intellectual rights and source code of the Blender engine. Then, to the surprise of everyone, Roosendaal and several ex-NaN employees, with the help and support of Blender's loyal users, managed to raise 100,000 EUR in seven weeks to make the purchase. Blender was free, and continues to be free to this day, supported by developers and used by artists around the world, under the GNU GPL License.

            Blender can be used for 3D modeling, animation, game-engine scripting (in some versions), and rendering. Most useful is Blender's built-in text editor (see Figure 5.7) for Python scripts, which can be used to customize tools, set up animations and effects, and even build sophisticated AI control over lighting and game objects.

            Figure 5.7. Blender's text editor readily opens a Blender Python script


            Blender offers a number of Python modules (shown in Table 5.5) to use in scripting. Some of them are still being ported into the newest version of Blender as of this writing.

            Table 5.5. Blender Python Modules

            Module     Description                      Porting Complete

            Blender    The main Blender module          yes
            BGL        The Blender OpenGL module        yes
            Camera     The Camera module                yes
            Draw       Display module                   yes
            Image      The Image module                 yes
            IPO        The IPO animation key module     no
            Lamp       The Lamp module                  yes
            Material   The Material module              no
            Mesh       The Mesh module                  no
            Nmesh      Low-level mesh access            no
            Object     The Object module                no
            Scene      The Scene module                 no
            Text       The Text module                  yes
            Window     The Window module                yes

            To switch to the scripting mode in Blender, press the Shift and F11 keys simultaneously or go to the current Window Type button and choose Text Editor. Click the Browse Datablock button and choose Add New to open a blank .py file. Blender will automatically name the file TX:text; you can change the name by clicking on it and typing in the new name (see Figure 5.8).

            Figure 5.8. Highlighted text controls in Blender


            To test out Blender, start by renaming a text file to MyFile.py, and then import the main Blender module. From that point on you have access to the Blender methods such as Object:


            import Blender
            MyObject=Blender.Object.Get("Some_Object")

            When running scripts on objects in Blender, you would normally have two windows open. One would be a workspace with the object within, and the second would contain the Python script that you would run on the object.

            Let's say you needed to run some complex math on a Mesh or Nmesh in Blender. First you import Mesh or Nmesh:


            import Blender
            from Blender import Nmesh

            Then grab the mesh object, its name, and its raw data using Object and Nmesh methods:


            MyObject = Blender.Object.Get("Some_Mesh_Object")
            MyMeshName = MyObject[0].data.name
            MyMesh = Nmesh.GetRaw(MyMeshName)

            Finally, run your complex math on each vertex, replace the values in your objects, and have Blender redraw the object:


            for each_vertex in MyMesh.verts:
                # complex math here
                # complex math here
                # complex math here
            Nmesh.PutRaw(MyMesh, MyMeshName)
            Blender.Redraw()
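            Inside Blender that loop runs over real mesh data, but the per-vertex math itself is ordinary Python. As a standalone sketch of the idea (no Blender required; the function name and the list-of-lists vertex representation are illustrative, not part of the Blender API), here is a loop that scales every vertex coordinate by a factor:

            ```python
            # Standalone sketch of per-vertex math, with each vertex
            # modeled as a plain [x, y, z] list of coordinates.
            def scale_vertices(verts, factor):
                """Return a new vertex list with every coordinate scaled."""
                return [[coord * factor for coord in vertex] for vertex in verts]

            verts = [[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]]
            print(scale_vertices(verts, 2.0))
            # → [[2.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 6.0]]
            ```

            In Blender you would assign the results back to the mesh's vertices before calling PutRaw, so the modified coordinates are written into the scene.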

            Blender is an excellent demonstration of the power of open source and open community development. Blender's user base is extremely supportive and creative, and is hard at work making Blender the best appliance since toasters. You'll find information on Blender at

            • The Blender community site. http://www.blender.org.

            • The Blender foundation site. http://www.blender.org/bf.

            • The Blender release page. http://www.blender3d.org.

            Nebula

            Nebula is an open-source, real-time, multi-platform 3D game engine that supports DirectX and OpenGL. The project is brought to us by the game studio Radon Labs, in Berlin. Nebula is actually implemented in C++, but what makes it super-fun is that it is also scriptable with Python, Lua, and Tcl/Tk. I'll talk a bit more about Nebula later in this book (specifically in the Lua sections).

            Panda3D

            Panda3D is a rendering engine for SGL. The core of the engine is in C++, but Panda3D also provides a Python scripting interface and utility code. I'll talk a bit more about Panda3D in the section on commercial games later in this chapter.

            Poser

            The Poser Pro Pack and Poser 5 come equipped with Python scripting as an available resource for artists; this is mainly used to automate advanced functions in the interface. Python scripts can be accessed from Poser's Window menu, which opens up a Python Scripts dialog box, as shown in Figure 5.9.

            Figure 5.9. Accessing Poser's Python Scripts dialog box


            The dialog box can be used as a placeholder for commonly used scripts. Clicking on a script while holding down the Alt key on a PC (or the Control key on a Mac) will bring up a text version of the script that you can edit.

            When creating custom scripts, much of the work in Poser is done through the Scene, which is accessed through the imported poser module:


            # First import the poser module
            import poser
            # Grab the scene
            MyScene = poser.Scene()
            # Then you would do things to the Poser scene
            # and at the end redraw the scene
            MyScene.DrawAll()

            Pretty nifty, huh? Poser actually has a very deep Python API; it goes way beyond scenes and comes equipped with predefined scripts for you to use. There is also a fairly large knowledge base and plenty of sample scripts within the community.

            Information on Poser can be found at Curious Labs's site, at http://www.curiouslabs.com/products/poser4#productinfo.


