Roger Suen's Blog

Coding for a living, and amusement.

ADF - Time Zone for af:convertDateTime

In the last post, about configuring the WebLogic Server time zone, I mentioned that one of the reasons for doing so is to set the default time zone ADF Faces uses to convert date and time values for input and output components. This post focuses on exactly that - how the ADF Faces convertDateTime converter and the af:convertDateTime tag work with the time zone configuration in detail. This is only the first piece of the puzzle; hopefully, I can put the other pieces together in two or more subsequent posts.

Here you can find the Source Code of the sample application, or Download ZIP of it.

Image: ADF Samples - Time Zone for af:convertDateTime

To keep things simple, I use java.util.Date in the discussion and the sample application, and the Oracle data type DATE in some figures. The basic idea applies equally to java.sql.Date, etc.

In Java, the class java.util.Date represents a specific point in time. As per the javadocs for one of its constructors - Date(long date):

Allocates a Date object and initializes it to represent the specified number of milliseconds since the standard base time known as "the epoch", namely January 1, 1970, 00:00:00 GMT.

In other words, the class Date represents a determinate point in time, measured against GMT. To display a Date, we need a converter or a formatter to turn it into a String representing the "wall clock time" local to a specific time zone. When the target time zone changes, the resulting String (the "wall clock time") may change, but the value of the Date itself does not.
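
A quick, ADF-independent sketch makes this concrete - one Date, two time zones, two different strings:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class WallClockDemo {
    public static void main(String[] args) {
        // One determinate point in time: the epoch itself.
        Date date = new Date(0L); // January 1, 1970, 00:00:00 GMT

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm");

        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(fmt.format(date)); // 1970-01-01 00:00

        // Same Date value, different "wall clock time".
        fmt.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println(fmt.format(date)); // 1969-12-31 19:00
    }
}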

The following figure illustrates how the date and time data passes through a typical ADF application:

Image: Date Data Handling

  • The ADF Faces component accepts the user input as a String value and converts it into a Java Date value with a DateTimeConverter.
  • The Oracle JDBC driver passes the Date value into the database as an Oracle DATE value.
  • For output, the JDBC driver retrieves the Oracle DATE value out of the database as a Java Date value.
  • The ADF Faces component displays the date and time after converting the Java Date value into a String value with a DateTimeConverter.

I'll talk about the JDBC part in my next post; here I'll focus on how the time zone configuration comes into play in the view layer:

Image: View Layer Time Zone

When an ADF Faces component works with a DateTimeConverter, a java.util.TimeZone object can be configured on it, as shown in the following code snippet from the sample application:

<af:inputText 
        id="it_dt"
        label="Date Time: "
        value="#{userBean.dateTime}" autoSubmit="true">
    <af:convertDateTime 
        pattern="yyyy-MM-dd HH:mm" 
        timeZone="#{userBean.inputTimeZone}"/>
</af:inputText>

Here's the description from ADF RichClient API - <af:convertDateTime> for the timeZone attribute:

Time zone in which to interpret any time information in the date string. If not set here, picks this value from trinidad-config.xml configuration file. If not defined there, then it is defaulted to the value returned by TimeZone.getDefault(), which is usually server JDK timezone.

When the component is used for user input, the TimeZone object specifies the source time zone in which the date string should be interpreted; the String input value is converted into a Date value, a fixed point in time measured against GMT. When the component is used for output, the TimeZone object specifies the destination time zone, and the Date value is converted into a String representing the local date and time in that time zone.
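
The same mechanics can be shown in plain Java - a sketch of the principle, not of the converter's actual implementation:

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class ConvertDemo {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm");

        // Input: interpret the user's string in the source time zone...
        fmt.setTimeZone(TimeZone.getTimeZone("Asia/Shanghai"));
        Date date = fmt.parse("2015-02-09 13:40");
        // ...yielding a fixed point in time, in milliseconds since the epoch.
        System.out.println(date.getTime()); // 1423460400000

        // Output: the same Date rendered in another destination time zone.
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(fmt.format(date)); // 2015-02-09 05:40
    }
}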

You can configure the time zones at three levels:

  • System-level time zone
  • Application-level time zone
  • Converter-level time zone

The system-level time zone can be configured as described in my last post - Configuring the Time Zone with WebLogic Server. The application-level time zone can be configured in trinidad-config.xml, as in the sample application:

<trinidad-config xmlns="http://myfaces.apache.org/trinidad/config">
  <time-zone>#{applicationBean.applicationTimeZone}</time-zone>
</trinidad-config>
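
The time-zone element takes an EL expression that should evaluate to a java.util.TimeZone object. Here's a minimal sketch of what the backing bean might look like (the hard-coded zone ID is only an illustration; a real application would read it from its settings):

import java.util.TimeZone;

public class ApplicationBean {
    // Illustrative default; load this from application configuration in practice.
    private TimeZone applicationTimeZone =
        TimeZone.getTimeZone("America/Los_Angeles");

    public TimeZone getApplicationTimeZone() {
        return applicationTimeZone;
    }
}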

The converter-level time zones can be configured with business-specific time zones or user-preference time zones, according to your application requirements. For example, an application displaying a flight's departure time and arrival time could use two different time zones, one for the departure airport and one for the arrival airport. That's the business-specific approach. You could also support user-preference time zones in this case as a user-friendly feature.

This post covers how time zones participate in the processing of date values in the ADF Faces view layer. In the next post, I'll look at what happens when date values are accessed through the Oracle JDBC driver.

Special Note for ADF Prior to 12c

In ADF 11g, the timeZone attribute of af:convertDateTime is documented as follows:

Time zone in which to interpret any time information in the date string. If not set here, picks this value from adf-faces-config.xml configuration file. If not defined there, then it is defaulted to GMT.

Sample Application

Environment

  • Oracle Alta UI
  • JDeveloper 12.1.3.0.0 Build JDEVADF12.1.3.0.0GENERIC_140521.1008.S
  • Safari Version 8.0
  • Mac OS X Version 10.10

Configuring the Time Zone with WebLogic Server

In order to handle date and time data properly in your ADF applications, you probably need to configure the WebLogic Server time zone, for reasons including but not limited to:

  • Configuring the default time zone for <af:convertDateTime> used by input and output components.
  • Configuring the time zone that affects how the Oracle JDBC driver handles date and time data.

This post introduces how to configure the time zone for an integrated or a standalone WebLogic Server, or for the ADF Model Tester.

Integrated WebLogic Server and ADF Model Tester

When you are running and testing your application using an Integrated WebLogic Server, or testing your model project with the ADF Model Tester, you can configure the time zone by adding the following system property to the Java Options on the Launch Settings page of the Edit Run Configuration window:

-Duser.timezone=UTC

To do this:

  1. Select the project in the Applications window.
  2. From the main menu, choose Application > Project Properties.
  3. Select Run/Debug.
  4. Choose Edit for the selected run configuration (a default run configuration is created for each new project).
  5. Add the time zone system property to Java Options.

Image: Edit Run Configuration

The configuration applies whenever the Java program is launched from JDeveloper, for example the Integrated WebLogic Server or the ADF Model Tester. To confirm it, look for the system property in the Log window after the program is launched:

Image: Log Window
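
You can also confirm the setting programmatically; any code running in the launched JVM should report the configured default. A trivial check in plain Java:

import java.util.TimeZone;

public class TimeZoneCheck {
    public static void main(String[] args) {
        // With -Duser.timezone=UTC, both lines should report UTC.
        System.out.println("user.timezone = " + System.getProperty("user.timezone"));
        System.out.println("default zone  = " + TimeZone.getDefault().getID());
    }
}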

Another way to configure the Integrated WebLogic Server time zone is to modify the properties of the integrated application server:

  1. In the Application Servers window, right-click the integrated application server (the default instance is called IntegratedWebLogicServer) and choose Properties.
  2. Select the Launch Settings tab.
  3. Add the time zone system property to Java Options.

Image: Application Server Properties

Please note that the Launch Settings of the Application Server Properties are used only when the server starts with no application selected (that is, when no application is open in the Applications window).

Caution: if the server starts with no application selected, and you then open an application and run it against that server, the Launch Settings defined in the Application Server Properties are used; the Java Options defined in the project's run configuration are ignored.

Standalone WebLogic Server

To configure the time zone for a standalone WebLogic Server instance: if you use a WebLogic Server script to start servers, edit the JAVA_OPTIONS in the script to set the system property (see "Specifying Java Options for a WebLogic Server Instance"); if you use the Node Manager to start servers, set the Java Options for each server instance in the Oracle WebLogic Server Administration Console (see "Set Java options for servers started by Node Manager").
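
For example, on Unix systems the start scripts honor the JAVA_OPTIONS environment variable, so a sketch of the change (the exact script varies by WebLogic version) is:

export JAVA_OPTIONS="${JAVA_OPTIONS} -Duser.timezone=UTC"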

BEA-141297 - Could not get the server file lock

While starting the WebLogic Administration Server or a Managed Server, you might encounter the following error that prevents the server from starting up:

<Feb 9, 2015 1:40:34 PM CST> <Info> <Management> <BEA-141297> <Could not get the server file lock. Ensure that another server is not running in the same directory. Retrying for another 60 seconds.>

This happens because the server lock file was left behind from the last run for some reason. To fix the error:

  • Navigate to the server-specific tmp directory under your $DOMAIN_HOME directory, in my case, for example: ~/Oracle/config/domains/base_domain/servers/AdminServer/tmp for the Administration Server or ~/Oracle/config/domains/base_domain/servers/wls_server_1/tmp for one of the Managed Servers.
  • Delete the lock file for the server instance: AdminServer.lok for the Administration Server or wls_server_1.lok for the mentioned Managed Server.
  • Start the server instance again.
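
For example, with the paths above, the cleanup for the Administration Server comes down to:

rm ~/Oracle/config/domains/base_domain/servers/AdminServer/tmp/AdminServer.lok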

WebLogic - Native library for the Node Manager

After the WebLogic domain configuration is complete, while starting the Node Manager, you might encounter an error as reported below:

WARNING: NodeManager native library could not be loaded to write process id
java.lang.UnsatisfiedLinkError: no nodemanager in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
    at java.lang.Runtime.loadLibrary0(Runtime.java:849)
    at java.lang.System.loadLibrary(System.java:1088)
    at weblogic.nodemanager.util.UnixProcessControl.<init>(UnixProcessControl.java:25)
    at weblogic.nodemanager.util.ProcessControlFactory.getProcessControl(ProcessControlFactory.java:23)
    at weblogic.nodemanager.server.NMServer.writeProcessId(NMServer.java:253)
    at weblogic.nodemanager.server.NMServer.writePidFile(NMServer.java:230)
    at weblogic.nodemanager.server.NMServer.<init>(NMServer.java:121)
    at weblogic.nodemanager.server.NMServer.main(NMServer.java:505)
    at weblogic.NodeManager.main(NodeManager.java:31)

<Feb 9, 2015 10:19:50 AM CST> <SEVERE> <Fatal error in NodeManager server: Native version is enabled but NodeManager native library could not be loaded>

This happens because, by default, Oracle enables the operating system's native libraries for use by the Node Manager, even when no native version is actually provided for the specific operating system. Here's the statement from the Oracle documentation Administering Node Manager for Oracle WebLogic Server:

Oracle provides native Node Manager libraries for Windows, Solaris, Linux on Intel, Linux on Z-Series, and AIX operating systems.

To fix this error on an unsupported operating system (like Mac OS, in my case), you can simply disable the native version support by updating a setting in the nodemanager.properties file. The file is only created after the Node Manager has started up once; it typically lives in the $DOMAIN_HOME/nodemanager directory. In the file, find the following setting:

NativeVersionEnabled=true

Update it as follows:

NativeVersionEnabled=false

Now, you can start the Node Manager:

nohup ./startNodeManager.sh > nm.out &

Check the log file: the warning that the NodeManager native library could not be loaded is still there, but the Node Manager should start up successfully after printing out the current configuration settings.

ADF - The markScopeDirty() method for ADF memory scopes

We have a coding convention for memory-scoped data in our ADF Faces development - always manage memory-scoped data, such as parameters or state, as properties of managed beans. Compared to putting the data directly into memory scopes, this convention has the following benefits:

  • allows proper documentation
  • allows validation, initialization, and logging
  • helps understanding and maintenance

There is a caveat, however, when applying this convention to the ADF Faces-specific scopes: page flow scope and view scope.

The ADF Controller uses session-scoped objects to hold the page flow scope and the view scope. When the high availability (HA) mode is on, the application server serializes objects in the session scope and replicates the serialized data within the cluster. To avoid blindly serializing the page flow scope and view scope, ADF optimizes the process: you have to make the framework aware of changes to one of these ADF scopes by marking the scope as dirty.

If the scope is modified by calling its put(), remove(), or clear() methods, the framework handles the marking for you, so you don't have to care about it when you put data directly into the scope. But when properties of managed beans are used, as our coding convention suggests, the framework must be notified of changes to the properties explicitly. This can be done with code like this:

import oracle.adf.controller.ControllerContext;
import oracle.adf.view.rich.context.AdfFacesContext;

// Mark the view scope as dirty so it is replicated in HA mode.
Map<String, Object> viewScope = AdfFacesContext.getCurrentInstance().getViewScope();
ControllerContext.getInstance().markScopeDirty(viewScope);

Repeating this in every property setter of every managed bean is surely bad. Hard-coding the scope a managed bean is put into does not look like a good idea either. These concerns lead to the following solution in the base class for all managed beans:

public static final String VIEW_SCOPE = "view";
public static final String PAGE_FLOW_SCOPE = "pageFlow";
private String scope;

public String getScope() {
    return scope;
}

public void setScope(String scope) {
    if (scope == null || VIEW_SCOPE.equals(scope) ||
        PAGE_FLOW_SCOPE.equals(scope)) {
        this.scope = scope;
    } else {
        throw new IllegalArgumentException("Unsupported ADF scope: " +
                                           scope);
    }
}

public void markScopeDirty() {
    if (this.scope == null) {
        return; // no scope is specified, skip
    }

    String prop =
        ControllerConfig.getCurrentProperty(ControllerProperty.ADF_SCOPE_HA_SUPPORT);
    if (!"true".equals(prop)) {
        return; // support not enabled, skip
    }

    Map<String, Object> scopeObject = null;
    if (VIEW_SCOPE.equals(this.scope)) {
        scopeObject = AdfFacesContext.getCurrentInstance().getViewScope();
    } else if (PAGE_FLOW_SCOPE.equals(this.scope)) {
        scopeObject =
            AdfFacesContext.getCurrentInstance().getPageFlowScope();
    } else {
        // should never happen, setScope() has done the validation
    }

    ControllerContext.getInstance().markScopeDirty(scopeObject);

    System.out.println("DEBUG: markScopeDirty for HA [scope=" + scope +
                       "]");
}

Now it's much easier to mark the scope as dirty: simply call the markScopeDirty() method in the property setter methods of the managed bean.
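
For example, a property setter in a subclass could look like this (dateTime stands in for any real property; the field and getter are omitted):

public void setDateTime(Date dateTime) { // java.util.Date
    this.dateTime = dateTime;
    markScopeDirty(); // notify the framework so the change is replicated in HA mode
}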

How do you set the scope that should be marked? It's recommended to supply scope as a managed property - that is, set the scope property right where the managed bean is declared to live in that scope:

Image: The scope managed property
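
In a task flow definition or adfc-config.xml, the declaration looks something like the following sketch (the bean class name is illustrative):

<managed-bean>
  <managed-bean-name>userBean</managed-bean-name>
  <managed-bean-class>com.example.view.UserBean</managed-bean-class>
  <managed-bean-scope>view</managed-bean-scope>
  <managed-property>
    <property-name>scope</property-name>
    <value>view</value>
  </managed-property>
</managed-bean>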
