This site has been created to log references to computer languages. If you have any suggested additions, please contact me.
From ACM's TechNews, July 25, 2005
"A First Programming Language for IT Students"
University of Southampton (ECS) (07/20/05); Gee, Quintin; Wills, Gary; Cooke, Eric
- Quintin Gee, Gary Wills, and Eric Cooke of the University of Southampton's Learning Technologies Group discuss which programming language should be taught first to IT students, taking into account substantial differences between IT students and traditional computer science students. The authors note that IT students tend to be either direct IT program registrants, transfers from computer science programs, mature students (23 years and up) with industrial experience, or transfers from other institutions or from overseas. Almost all IT students seek computer industry careers with titles such as IT manager, operations manager, project manager, project leader, team leader, or technical manager. Gee, Wills, and Cooke argue that including programming in an IT degree program can help students acquire an understanding of the uses and limitations of computer technology, but the method of teaching must cover programming and program design without emphasizing a particular coding language. Choosing the programming language to train students in is a challenge, given that there are so many different languages, each carrying both advantages and drawbacks. One suggestion is to teach programming using the basic components of an existing production language such as Java, Delphi, or Visual Basic, which most students have heard of. The University of Southampton offers a programming module in the second semester of the IT degree program's first level that is designed to make students proficient in algorithmic problem solving; knowledgeable about basic programming principles; aware of the software development process; and experienced with a development environment and with writing programs in microcosm. The authors conclude that IT students generally favor a language they are already familiar with that offers a simple IDE and is a real programming language.
Click Here to View Full Article
Love That 'Legacy'
News Story by Gary H. Anthes
Computerworld, July 04, 2005
- Like it or not, old code is still around, and it needs special care. Read the article
From ACM's TechNews, July 20, 2005
"The Resurgence of Mainframes?"
InternetNews.com (07/18/05); Boulton, Clint
- Reports on the death of the mainframe's viability are exaggerated, although the population of skilled mainframe programmers is shrinking thanks to the retirement of baby boomer mainframe specialists, and the fact that most computer science graduates are being trained on Windows or Unix operating systems. Sun Microsystems' Don Whitehead says the shortage of mainframe experts, along with cost issues and a paucity of mainframe applications to satisfy changing business requirements, is spurring customers to abandon mainframes. IBM is trying to reverse the decline in mainframe skills through its Academic Initiative zSeries program, which provides zSeries mainframes, software, and training to 150 participating universities in an effort to boost mainframe proficiency among students. IBM wants to have twice as many schools involved in the program by year's end, and the initiative's goal is to have 20,000 mainframe-savvy IT professionals in the market by the end of the decade. This is vital if the company is to continue selling its zSeries systems, because clients may choose smaller Unix or Windows-based systems if there are not enough competent mainframe programmers. Professor David Douglas of the University of Arkansas' Walton School of Business says getting students interested in mainframes is a challenge because "they've grown up with a mouse in their hand and a PC." Computer science people use Unix almost exclusively, while the majority of business schools employ Microsoft's Windows environment.
Click Here to View Full Article
From ACM's TechNews, July 18, 2005
"Father of Java Talks Futures"
eWeek (07/15/05); Taft, Darryl K.
- In a recent interview, Sun Microsystems fellow and Java creator James Gosling outlines his thoughts on the future of the language. Central to Sun's work on Java has been integration with other languages, including one complex, ongoing project that has numerical computing and multithreading capabilities as its focus. Gosling notes that numeric computing has been researched for decades, and although Java is far along in terms of multithreading, working with complex numerical applications is much harder. He says there are a lot of "hard problems in building something that scales to hundreds of thousands of threads, and really does the kind of concurrency that people need in numeric computing." Gosling expresses his desire to include small object support, an innovative method for cross-package references, and enhanced rendering capabilities in the next version of Java. In addressing the issue of open source, Gosling cites Sun's testing clauses as the major impediment to a universal embrace of Java by the open source community, though he does not feel that open source initiatives from other companies will hurt Sun's market share. He credits Ajax with helping to mitigate the interoperability problem of multiple programmers writing in JavaScript. Gosling is also optimistic about Java's ability to interoperate with dynamic languages through Sun's Coyote project, which he sees as executing his vision of creating a language that is at once dynamic and highly functional.
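The scaling problem Gosling describes is usually approached in today's Java with the java.util.concurrent utilities introduced in J2SE 5.0, which multiplex many logical tasks onto a small pool of OS threads. This is only a minimal illustrative sketch of that idiom, not anything from Sun's numeric-computing project; the class name and the trivial squaring task are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrencySketch {
    public static void main(String[] args) throws Exception {
        // A fixed-size pool runs many small numeric tasks on a few OS threads,
        // rather than creating one thread per task.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<Long>> results = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            final long n = i;
            // Each submitted task is trivial numeric work: square its index.
            results.add(pool.submit(() -> n * n));
        }
        long sum = 0;
        for (Future<Long> f : results) {
            sum += f.get(); // get() blocks until that task has finished
        }
        pool.shutdown();
        System.out.println(sum); // sum of squares 0..9999
    }
}
```

Even this standard idiom only hides thread management; it does not by itself deliver the hundreds of thousands of concurrent activities Gosling says numeric computing needs, which is why he frames it as an open research problem.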
Click Here to View Full Article
"Open Source Rules Campus Programs"
Business Journal of Portland (07/17/05); Earnshaw, Aliza
- Web developers in Portland, Ore., are finding it difficult to hire computer science graduates with experience in Microsoft's .Net Web site development environment, because most college graduates are trained on open-source systems, according to Mark Brody of Opus Creative. Portland State University computer science department director Cindy Brown says, "We teach our students the principles that we think will hold for the long term, that will help them learn over the long term," while businesses desire people who can dive head-first into the fast-paced world of commercial work. She believes university programs can train people to be adaptable, which would better prepare them for the rigors of the business sector. Pop Art President Steve Rosenbaum notes that .Net training is offered by two-year colleges, but companies such as his generally prefer people with four-year degrees because their hiring policies require programmers proficient in communication and customer relations. Brown says her school teaches on open-source operating systems because doing so enables students to understand the software's inner mechanisms and because it costs nothing. Rosenbaum says large, traditional companies with complicated Web sites prefer Microsoft's proprietary technology, while Brody says there is no accountability for open-source software since it is generated and tweaked by a community of enthusiasts.
Click Here to View Full Article
From ACM's TechNews, July 11, 2005
"Love That 'Legacy'"
Computerworld (07/04/05) P. 27; Anthes, Gary H.
- Defined variously as obsolete and reliable, and referred to with both admiration and scorn as Cobol or mainframe code, legacy systems still occupy an important position in today's programming landscape. Despite poor documentation and the melting-pot effect that comes from many people having tinkered with the code over the years, legacy programs are the bread and butter of many IT operations. Tower Records has held onto its Cobol-based point-of-sale software system that was written in the mid-1980s: Tower's programmers have updated the code several times, effectively rewriting much of it to fit their needs; they also produced their own user manuals, removed lines of code that applied specifically to other retailers, and converted large, unwieldy chunks of code into smaller, more digestible programs. The Ship Systems division of Northrop Grumman has roughly 7 million lines of Cobol and Fortran code, and Northrop Grumman CIO Jan Rideout believes that mainframe code offers the reliability and security that newer, untested programs cannot promise. Her programming staff enjoys low turnover, and new hires are taught Cobol right away. "Once people get over the it's-my-father's-Cobol thing, the young kids can be a little open-minded and get into these older systems and see that there are some interesting aspects to them," said Rideout. Despite the ease with which incremental changes to old code can be implemented, Ship Systems is moving away from mainframe code in favor of a commercial software package. As many developers are reluctant to give any attention to maintaining or updating Cobol systems, some IT executives are looking to offshore labor to squeeze more life out of old languages.
Click Here to View Full Article
From ACM's TechNews, July 1, 2005
"Why Linux Needs Rexx"
NewsForge (06/28/05); Fosdick, Howard
- Some Linux proponents see in Rexx the potential to give Linux the boost it needs to overtake the Windows desktop, writes Howard Fosdick, author of "Rexx Programmer's Reference." IBM invented the scripting language many years ago, and its advocates cite numerous advantages, including free access, portability, standardization, and a widely established community of users. Rexx enjoys a preeminent position on mainframes, and is still widely remembered for having powered scripting on notable early desktops such as the Amiga OS and OS/2. Rexx also carries the important benefit over Perl of being an easy language to learn, while it is still powerful enough to be relevant in today's climate. Unlike Python, Rexx appeals to the casual user who might not naturally approach a problem on an object-oriented basis. Tcl/Tk falls short on the compatibility front, and it lacks Rexx's proven history. Two varieties of Rexx are available: "classic" Rexx and Open Object Rexx, which is completely object-oriented, offering programmers a choice based on their needs and level of comfort with the language. For Linux to compete, it needs a language such as Rexx that reaches out to a broad spectrum of users, rather than one confined to those with a high degree of technical expertise.
Click Here to View Full Article
From ACM's TechNews, June 27, 2005
"Java Faces Open-Source Swarm"
CNet (06/27/05); LaMonica, Martin
- In an industry moving inevitably toward open-source sharing, Sun Microsystems holds a tenuous grasp on the Java language it created. At the upcoming JavaOne conference, Sun will unveil GlassFish, which opens up access to the code of Sun's Java application server, along with another, as-yet-unnamed project pertaining to the Java Business Integration specification. Sun faces mounting competition from other software developers making bolder forays into the open-source field, such as BEA Systems, which will announce support for the open-source frameworks Spring and Struts at the conference. The open-source push, designed to generate a higher-quality Java model through industry-wide collaboration, has left Sun on the outside looking in, according to some experts. "The convoluted way that Sun has managed the whole process has cost them a great deal with their reputation in the developer community," says analyst Dana Gardner. While Sun has consistently taken the stance that central control over Java ensures greater compatibility, it has made some concessions to the development community, such as the Mustang edition of the Java 2 Standard Edition, which lets developers view the code as it is written. Even as developers warm slightly to Sun as the company relaxes its grip on Java, competitors are offering appealing alternatives such as integrated development environments (IDEs), "frameworks" that accelerate Java programming, scripting languages, and even the LAMP stack. The Java Community Process (JCP), which regulates Java application programming interfaces, inhibits Sun's evolution in the market compared with open-source alternatives, says Exadel CEO Fima Katz: "Sun is losing momentum. Because of the slow (JCP) process, people are frustrated."
Click Here to View Full Article
From ACM's TechNews, June 20, 2005
"Software Advance Helps Computers Act Logically"
NIST Tech Beat (06/16/05)
- The International Organization for Standardization (ISO) is expected to approve a new software language that would enable a computer to "think" about the meaning of a command instead of just responding to it. The process specification language ISO 18629, developed by National Institute of Standards and Technology researchers and colleagues around the world, makes use of artificial intelligence and language analysis to achieve its intuitive qualities. About 300 concepts, such as "duration" and "sequence," have been incorporated into the software structure. The software language has a basic understanding of context-specific language, and would know that "turn on the coolant, before milling" means the first action continues after milling begins. ISO 18629 represents commands in the context of a manufacturing plan, and can handle the exchange of process, validation, production scheduling, and control information for guiding manufacturing processes.
Click Here to View Full Article
From ACM's TechNews, June 15, 2005
"Sun Builds a Fortress for Scientists"
SD Times (06/01/05) No. 127, P. 14; Lee, Yvonne L.
- Sun Microsystems is developing a technical language called Fortress that is envisioned as a successor to Fortran, although the new language will not be ready for at least another five years. The Defense Advanced Research Projects Agency is helping fund the development of Fortress in the hope that the language will ultimately produce technologies for government and industrial applications. In April of this year, Guy Steele Jr., Sun fellow and principal investigator for the programming languages research group, introduced the Fortress language at Sun Labs Day. Steele said that Fortress will make extensive use of libraries, thus making the language "growable" and flexible. "Whenever we're tempted to add a feature to the language, we ask ourselves, 'Could this feature be provided by a library instead?'" Steele said. He explained that Fortress will be similar to Java in that it compiles application components into platform-independent bytecode before runtime while interpreting parts of the application at execution. The language aims to lead the field in symbolic programming of equations, and Steele predicts that scientists and mathematicians will be more productive when their programs resemble the equations they write.
Click Here to View Full Article
From ACM's TechNews, June 3, 2005
"Evolving the Java Platform"
Software Development Times (05/15/05) No. 126, P. 33; Hamilton, Graham
- Sun fellow Graham Hamilton writes that a key theme of the next Java 2 Standard Edition (J2SE) and Java 2 Enterprise Edition (J2EE) iterations is "ease-of-development," the need to maintain a balance among power, richness, and simplicity to ensure that the new Java specs are easy to use. J2SE 5.0, code-named Tiger, features a mechanism within the Java language that lets developers specify desired behavior by tagging source code with annotations. Currently under development are J2SE 6.0 (Mustang) and J2SE 7.0 (Dolphin): Mustang will include a full-scale scripting engine, among other features, while the Dolphin release is expected to include direct XML support and a new Java Virtual Machine instruction aimed at Groovy, Python, and similar "dynamic languages." J2EE 5.0, meanwhile, will feature significant changes to the transactional data access layer in Enterprise JavaBeans (EJB) 3.0. These changes involve substantial simplification of persistence mapping between relational database tables and in-memory Java objects, as well as of the rules for defining an object as a transactional EJB. The Java platform will support Web services and a service-oriented architecture for distributed systems through the JAX-RPC standard, which is being significantly streamlined in J2EE 5.0 using Java language annotations to specify the definition and use of Web services. Such Web services are envisioned to support interoperability between Java and .NET.
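The annotation mechanism that Tiger introduced can be sketched briefly. The @Retention and @Target meta-annotations below are part of the real J2SE 5.0 API, but the @Audited annotation, its reason element, and the class and method names are invented purely for illustration:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// A custom annotation: a tag on source code that tools or frameworks can
// read, here kept at runtime so reflection can see it.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Audited {
    String reason() default "unspecified";
}

public class AnnotationSketch {
    // The developer particularizes desired behavior simply by tagging code.
    @Audited(reason = "handles money")
    public void transfer() { }

    public static void main(String[] args) throws Exception {
        // A framework would discover the tagged behavior via reflection.
        Method m = AnnotationSketch.class.getMethod("transfer");
        Audited a = m.getAnnotation(Audited.class);
        System.out.println(a.reason());
    }
}
```

This declarative style is what lets the streamlined JAX-RPC and EJB 3.0 work Hamilton mentions replace verbose deployment descriptors with tags read directly from the source.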
Click Here to View Full Article
From ACM's TechNews, June 1, 2005
"Java Sets Sail for the Final Frontier"
VNUNet (05/25/05); Sanders, Tom
- Java creator and Sun Microsystems vice president James Gosling says the programming language will enable radical new applications as it becomes more widely deployed at the edge of the network. Java-coded sensors at the bottom of San Francisco Bay could be used to build predictive models, or sensors could be used for an airplane's control system, while Java-enabled devices at the network edge could also help financial services companies make small business loans to rural farmers in developing countries, or make health care more efficient by enabling effective global resource allocation. Gosling says he envisioned none of these applications when he created Java, but knew from experience that inventive people would use unrestrictive technology to do unexpected things. Another consideration in early Java development was the realization that the technology would support functions ordinary people use every day, and Gosling says the thought of his children using Java-dependent devices made him refuse to compromise on security or reliability. Still, Gosling says system testing needs to be less expensive and complex; he cites the example of the Federal Aviation Administration, which uses an extremely difficult and costly testing regime. Sun Microsystems focuses on managing the growing number of Java libraries and the virtual machine, ensuring reliability, performance, and scalability. Java needs to be able to deal with the large multi-processor chips that are emerging, and part of that challenge means improved development tools; the Java Development Kit 5.1 makes instrumentation always available for debugging code while it is actively deployed. Gosling predicts the WS-* protocols will become more important in an engineering sense.
Click Here to View Full Article
From ACM's TechNews, May 25, 2005
"C++ Gets a Multicore Tune-Up"
TechWeb (05/24/05); Wolfe, Alexander
- University of Waterloo computer science professor Peter Buhr is offering a new set of extensions for the C++ programming language that aims to help software developers take advantage of multicore and multi-threading processors. Buhr has released the micro-C++ project under an open-source license and will present the technology at the Gelato Federation technical meeting this week in San Jose. Intel and Hewlett-Packard provided financial support for Buhr's project through the Gelato group, which promotes Itanium and Linux systems. Micro-C++ is not limited in terms of operating system or processor technology; it essentially enables programmers to easily separate threads in their code using four new classes not included in the original C++ language, after which the code is translated into normal C++ and converted into an executable image with help from a compiler and a micro-C++ runtime library. Buhr says a lot of software development has focused on Java over the last six years, but now people are turning again to C++. There are other projects and technologies that deal with C++ for multicore and multi-thread processing, including Posix threads, the Boost.org project, the Adaptive Computing Environment toolkit, and SourceForge's C++ threads library, but none of these technologies has won widespread support or been incorporated into the official C++ language. Buhr also serves on the C++ subcommittee that is exploring revisions for easier multicore programming, but he says there is currently no clear path for adding that functionality.
Click Here to View Full Article
"Complexity, Chemistry, Commuting and Computing"
ITWorld.com (05/19/05); McGrath, Sean
- The programming language wars are likely never to go away because they represent different people's opinions about how to manage inherent complexity, writes XML expert and Propylon CTO Sean McGrath. Tesler's law states that for every business process there is a base level of complexity that can never be erased, only moved. Sometimes eliminating complexity is not even a goal; simply offloading it somewhere else is: it all depends on point of view and thresholds for risk, stress, time, and other factors. In terms of software programming, different programming languages deal with complexity at different levels; Perl arms algorithm designers with a number of language features with which to handle complexity, while Java and C# rely on standard libraries such as the JDK and CLR, respectively. Python programmers use a core set of complexity-management devices repeatedly for a variety of situations. Each of these approaches deals with inherent complexity in a different way, and which one is correct will depend on the programmer's or team's tolerance for complexity and point of view. The same concept was behind the RISC versus CISC debate, in which some saw a small number of fast instructions as more useful than a broader range of slower instructions. Instead of focusing on eliminating complexity with a preferred tool, programmers should recognize that there is inherent complexity that can be addressed in different ways with different tools.
Click Here to View Full Article
From ACM's TechNews, May 20, 2005
"COBOL Skills Needed in the Future"
Search390.com (05/17/05); Stansberry, Matt
- COBOL is not necessarily a thing of the past, considering the growth in mainframe use and the enormous amount of COBOL code currently running. In fact, programmers who are skilled in the legacy language are likely to be in high demand over the next decade, when many COBOL programmers retire. Preliminary results of a Micro Focus International survey of 750 mainframers in the United States and Canada indicate that the median age of the COBOL programmer is 45 to 49. When the final results of the survey are made available in June, they are expected to show that 52 percent of mainframe applications are still written in COBOL, and that 41 percent of mainframers consider COBOL to be a principal programming language, a margin of about 25 percent over Java. Ron Kizior, a Loyola University Chicago School of Business professor who was involved in the research, says the skill that will be in demand in the years to come is the ability to integrate COBOL with Web-oriented development tools; only 10 percent of mainframe shops were found to have gone to the Web. "Some of the more enlightened universities are trying to get to grips with this emerging requirement, and are bringing the necessary mainframe components back onto the syllabus," says analyst Mark Lillycrop.
Click Here to View Full Article
From ACM's TechNews, May 5, 2005
"At 10, Java's Wild Success, Missed Chances"
IDG News Service (05/05/05); McMillan, Robert
- Sun Microsystems' Java programming language celebrates its 10th birthday this month, having grown from a language aimed at Web developers into a much broader collection of software and specifications used on everything from mobile phones to mainframe computers. "It's been a rocket ride that nobody expected would ever get near this far," says Sun President Jonathan Schwartz. Java was originally left-over code from the FirstPerson interactive television venture and was released by Sun in 1995 as a way to animate images on Web sites. Java's main appeal was its "write once, run anywhere" promise, which freed developers from having to compile their code for different hardware. The freely available source code was integrated with the Netscape Navigator Web browser and drew 6,000 attendees to the JavaOne conference in 1996. JavaLobby.org founder Rick Ross says the language drew together the large IT vendors in an unprecedented way. Analyst James Governor notes that Java dramatically changed IBM's software group by proving IBM could successfully build on something it did not own, a model IBM then followed with Linux. For Sun, however, Java's success was mixed. While Sun was not as successful as competitors in marketing its Java developer tools and application servers, the company did benefit from somewhat stymieing Microsoft's .Net effort, which Sun CEO Scott McNealy says would have closed the window on Sun's hardware business. Sun also missed an opportunity to get Java on the desktop, spending seven years fighting Microsoft in court over its desktop implementation. Meanwhile, Sun also made it difficult for companies such as Apple and Intel to contribute to Java, Ross says. Looking back, it is striking how quickly Java grew. "There were so many opportunities, it was hard to know what to do," says Sun vice president and Java platform fellow Graham Hamilton.
Click Here to View Full Article
From ACM's TechNews, April 27, 2005
"Open Source Developers Provide 'Glimmer of Hope'"
ZDNet UK (04/22/05); Marson, Ingrid
- DAFCA object architect and software design expert James Coplien said in an interview at the ACCU conference that companies' rush to bring software to market is fueling a decline in product quality, noting that the open source community subscribes to higher software quality standards. "The one glimmer of hope is the people who've said, 'Screw the industry, we're going to write excellent software and give it away,' in other words, the open source movement," he declared. Coplien argued that open source software is more secure than closed source or proprietary software because it is born of a collaborative effort between contributors and a central community of developers. "The complementary, independent, selfless acts of thousands of individuals [in the open source community] can address system problems--there are thousands of people making the system stronger," he said. Coplien also said open source software is better tested than proprietary software because it is scrutinized by more people, who are encouraged to look for glitches. Several industry experts at the ACCU conference disputed Coplien's arguments: Texas A&M University professor and C++ inventor Bjarne Stroustrup said not all open source software is of high quality, adding that some of the best code in existence is closed source. Meanwhile, Cambridge University security engineering professor Ross Anderson argued that the easy detection and patching of vulnerabilities in open source software, which relies on the wide availability of code, can be exploited by hackers.
Click Here to View Full Article
From ACM's TechNews, April 25, 2005
"C++ Creator Upbeat on Its Future"
CNet (04/22/05); Marson, Ingrid
- C++ programming language inventor and Texas A&M University professor Bjarne Stroustrup said at the ACCU Conference that a backlash against newer languages such as C# and Java has sparked a resurgence in C++ usage, claiming there are now upwards of 3 million C++ programmers. He said the lack of a "propaganda campaign" is the chief reason why people are ignorant of this trend. For example, Sun Microsystems aggressively hyped Java's role in the Mars Rover program, while Stroustrup says C++ was employed for scene analysis and route planning in the vehicle's autonomous driving system. Evans Data appears to challenge Stroustrup's claim of C++ growth with data indicating a 30 percent decline in the percentage of developers using C++ between spring 1998 and fall 2004, although the firm expects the decrease to slow considerably over the next several years. A recent Forrester Research survey of over 100 companies estimated that 59 percent of respondents used C/C++ in production systems, 61 percent used Visual Basic, and 66 percent used Java, leading Forrester analyst John Rymer to conclude that Stroustrup's assertion of approximately 3 million C++ developers is "plausible." RedMonk analyst James Governor said the assumption that Java and Microsoft languages such as Visual Basic and C# are the primary languages used by developers is erroneous. "C++ still has a role and dynamic scripting languages, such as PHP and Python, are growing, not shrinking, in importance," he declared.
Click Here to View Full Article
From ACM's Queue, April 18, 2005
- The thorough use of internal documentation is one of the most-overlooked ways of improving software quality and speeding implementation.
Read the Article
From EduPage, April 8, 2005
U.K. INITIATIVE BACKS OPEN SOURCE
CNET, 7 April 2005
- The government in the United Kingdom is sponsoring an initiative designed to encourage use of open source applications among public agencies. The Open Source Academy will offer a number of resources to public-sector offices, including a code repository and an open source platform with which agencies can collaborate on software projects. The Open Source Academy will also share news and information about the adoption and use of open source technology in a variety of sectors.
Mark Taylor, executive director of the Open Source Consortium, which is involved in the new initiative, said the public sector in the United Kingdom trails that in other European countries and that the new program should help close that gap. Taylor also commented that the notion that open source is only beneficial to poorer countries is a myth he hopes to dispel.
Read the Article
From ACM's TechNews, April 4, 2005
"Computerworld Development Survey Gives Nod to C#"
Computerworld (03/28/05)
- A Computerworld survey of developers found that 72 percent of respondents used Microsoft's C# programming language, while 66 percent used Java; the third, fourth, and fifth most-used programming languages were Visual Basic (62 percent), C++ (54 percent), and JavaScript/ECMAScript (50 percent). Microsoft .Net was the framework/API of choice for 51 percent of respondents, and 48 percent said Unified Modeling Language was not in use at their organizations, compared to 33 percent reporting that it was. Thirty-seven percent said they used mostly Java, 26 percent said they used mostly .Net, and 23 percent reported using both Java and .Net. Half of the polled developers said they use open-source code, although the majority said that 64-bit applications, Linux applications, or wireless applications are not under development at their organizations. Fifty-eight percent reported that their organizations were already utilizing Web services and developing more projects; 15 percent said pilot Web services projects were underway, 12 percent claimed to have an active interest, and 9 percent indicated no interest. An integrated development environment (IDE) was the preferred editor for 45 percent of respondents, while 35 percent said they used both IDE and text editors and 17 percent said they favored text editors. Thirty-eight percent of those surveyed held the title of IT manager, and 36 percent held the title of developer. Thirty-nine percent of respondents said their company produces or sells software and services for internal use only, and 58 percent indicated that they develop applications that are deployed for corporate-wide or business-to-business use.
Click Here to View Full Article
From ACM's TechNews, March 30, 2005
"Brazil: Free Software's Biggest and Best Friend"
New York Times (03/29/05) P. C1; Benson, Todd
- Brazil is becoming free software's largest benefactor with President Luiz Inacio Lula da Silva's mandate that government ministries and state-run businesses transition from expensive proprietary operating systems to free operating systems in an effort to save millions of dollars in royalties and licensing fees. Brazil's government plans to implement its PC Conectado (Connected PC) program to help low-income Brazilians purchase their first computers by the end of next month, and National Institute of Information Technology President Sergio Amadeu wants the computers the program offers to run free software only. "It's the government's responsibility to ensure that there is competition, and that means giving alternative software platforms a chance to prosper," Amadeu says. It is estimated that just 10 percent of Brazil's 183 million people are connected to the Internet, while only 900,000 computers are sold legally every year. The PC Conectado program aims to give computer makers tax incentives in return for dramatically discounting their products, while consumers would be able to take advantage of a plan in which they pay for desktops with 24 monthly installments of 50 to 60 reais (about $18 to $21.80), considered to be an affordable price for many working poor. Some people believe the government should apply its tech programs to schools and other areas where they argue a more pressing need for computers exists, and the government has promised complementary programs to spread computers' school presence as well as open many poor-area computer centers with free software and free Internet access by the end of 2005. MIT Media Lab director Walter Bender sent a letter to Brazil's government in which he stated that "free software provides a basis for more widespread access, more powerful uses and a much stronger platform for long-term growth and development."
"Report: P-Languages Better for Enterprise"
InternetNews.com (03/25/05); Singer, Michael
- A Burton Group report finds that P-Languages such as Perl, Python, and PHP have come a long way over the last several years thanks to their ability to complement the use of G-Languages such as Java, C++, and C#, and suggests that P-Languages should be favored over G-Languages because of their performance in enterprise scripting and other mission-critical functions. "P-Languages...should be viewed as additional, albeit first-class tools that information technology organizations can use to solve enterprise-scripting problems," recommends Burton analyst Richard Monson-Haefel in the report. He notes that PHP can dramatically simplify the dynamic generation of HTML and the processing of HTTP requests, while Perl is often employed in Unix and Linux system administration as well as for batch transformations of text data. Python, meanwhile, finds use in system administration, text processing, and application development, and frequently serves as "glue" code. A single line of code in a P-Language could typically execute the same number of tasks as five lines of G-Language code, thus reducing developers' writing and debugging chores and making it easier for them to learn unfamiliar systems, according to Monson-Haefel. P-Language pick-up has been significant in open-source platforms, while Burton says that Perl and Python are no more vulnerable to hacker intrusion than most programming languages. However, products that use PHP as a platform appear to be particularly vulnerable, and Monson-Haefel says the PHP community must make a more conscientious effort to toughen up the language.
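Monson-Haefel's five-to-one claim is easiest to see side by side. Here is a minimal Python sketch (the log lines are invented for illustration, and the ratio is the report's rule of thumb, not a benchmark): tallying values that would take a map declaration, a loop, and explicit get/put calls in a G-Language fits in a single expression.

```python
from collections import Counter

# Hypothetical Web-server log lines, invented for this example.
lines = [
    "GET /index.html 200",
    "GET /about.html 404",
    "GET /index.html 200",
]

# One line of Python tallies the status codes; a typical G-Language
# version needs a map declaration, a loop, and an explicit get/put dance.
status_counts = Counter(line.split()[-1] for line in lines)

print(status_counts["200"])  # 2
```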
From ACM's TechNews, March 25, 2005
"The 'dotCommunist'"
Chronicle of Higher Education (03/25/05) Vol. 51, No. 29, P. A31; Foster, Andrea L.
- Columbia University law professor and Free Software Foundation general counsel Eben Moglen is a fervent believer in free software as part of his struggle to promote freedom of speech and advance knowledge. He defines software as a "public utility," and argues that software patents and other attempts to restrict its use or sharing are immoral. Creators of open-source software, which Moglen views as vital to sustained innovation, license it under usage terms designed to prevent companies from commandeering and commoditizing the software, but commercial software makers are targeting such licenses by arguing that free software is hurting their bottom line. The recently established Software Freedom Law Center, which Moglen heads, will help producers of open-source software surmount these challenges by providing free legal advice, and also by refining and enforcing open licenses. The companies supporting the new center share Moglen's vision of open-source systems eventually trouncing Microsoft's software market monopoly and helping to bring down the current system of information ownership. Moglen sees digital information-exchange media as important tools for improving society, and notes that the Internet has enabled economically disadvantaged people to access the same information that the economically advantaged have. On the other hand, Boston intellectual property lawyer Steven Henry believes proprietary and open-source software will exist alongside each other for quite a while, and says open-source software will not thrive unless licenses that fortify businesses' ability to integrate code from both free and commercial software are crafted.
From ACM's TechNews, March 23, 2005
"Tool Turns English Into Code"
Technology Research News (03/20/05); Patch, Kimberly
- MIT researcher Hugo Liu believes the Metafor program for translating natural-language descriptions of software into scaffolding code--a program "skeleton"--could be practically applied to software brainstorming in less than two years; later applications could include a programming teaching tool for kids and enhanced storytelling. Liu explains that Metafor renders the innate structure of English as a fundamental programmatic architecture of class objects, properties, functions, and if-then rules. "The basic ingredients for the program are there--the noun phrases are objects, the verbs are functions, [and] the adjectives are object properties," he notes. MIT researchers crafted a parser that deconstructs text into subject, verb, and object roles, and programming semantics software that maps data from these English language constructs to basic code structures; this data is then applied toward the real-time interpretation of skeleton code in any of seven programming languages. Metafor displays four panels to the user: One panel for entering sentences, and three panels containing canned dialogue that verifies the user's statement, debug data, and the programming language version of the code. Running a mouse over a code object produces a pop-up window displaying an English explanation of the code, says Liu. The researchers studied the performance of groups of intermediate and beginning programmers using Metafor, and found that the program accelerated brainstorming about 10 percent for intermediates and 22 percent for beginners. Liu observes that Metafor encouraged programmers to use simple and declarative language, which subsequently made the system more effective for brainstorming and outlining.
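The noun-to-object, verb-to-function mapping the article describes can be sketched in a few lines. The toy parser below is an assumption for illustration only; Metafor's real parser and programming-semantics engine are far more capable, and this sketch handles just one hard-coded sentence pattern.

```python
def sketch_class(sentence: str) -> str:
    """Map a '<subject> can <verb> <object>' sentence to a class skeleton:
    noun phrase -> class, verb -> method, final noun -> parameter."""
    # Drop articles, lowercase, and split into words.
    words = [w for w in sentence.rstrip(".").lower().split()
             if w not in ("a", "an", "the")]
    subject = words[0]
    verb = words[words.index("can") + 1]
    obj = words[-1]
    return (f"class {subject.capitalize()}:\n"
            f"    def {verb}(self, {obj}):\n"
            f"        pass\n")

print(sketch_class("A player can throw the ball."))
```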
From ACM's TechNews, March 21, 2005
"Agile Breaks on Through to the Other Side"
Application Development Trends (03/05) Vol. 12, No. 3, P. 22; Swoyer, Stephen
- Agile software development approaches such as eXtreme Programming (XP) can boost productivity and enable products to arrive at projected delivery dates and fulfill expectations by skipping the bureaucracy typical of the classical "waterfall" software development model, where programming often takes a back seat to planning and documentation. The agile development model supports a Whole Team strategy wherein line-of-business representatives are involved as team members from the beginning; they frequently meet with programmers to provide as much input regarding the final product's features, functionality, and performance as possible. Though agile methods differ from project to project, they are unified in their emphasis on customer interaction, pair-partnering among developers, fast coding with test-driven development and regular refactoring, and planning. Agile approaches can be a hard sell to management, given increasingly risk-averse corporations' unfamiliarity with the methodology. To overcome this reticence, advocates suggest a gradual introduction of agile methods in areas where they make the most sense. Primavera Software's Bob Schatz notes that an organization's willingness to adopt XP or other agile methods is predicated on the pain it suffers because of its reliance on non-agile methods. Author and XP guru Ron Jeffries says that an agile effort's success depends on getting management on board, preferably from at least two levels up, and the best enticement for management is working software. Most coders voluntarily pursue agile programming without any pressure to do so, but some programmers are getting introduced to agile methods on management's orders, or as part of starting a new position with a new employer.
From ACM's TechNews, March 18, 2005
"Irish Open-Source Groups Protest Software Patents"
eWeek (03/17/05); Broersma, Matthew
- Open-source advocates throughout the European Union (EU) are concerned that a directive officially endorsed by the EU Council will legitimize "pure" software patents and jeopardize European contributions to open-source projects. Irish open-source groups fired off a briefing document to Ireland's Members of European Parliament (MEPs) as an answer to members' own inquiries about software patents, reports open-source activist Barry O'Donovan. O'Donovan set up a form on KDE.ie as a way for constituents to get in touch with Irish MEPs to air their concerns about software patents, and he estimates that some 400 emails were sent; some MEPs responded to the missives by requesting more information on the EU patent directive and its potential ramifications for Irish research and industry. MEPs have three months to either modify or discard the draft directive, which is currently going through a second reading in Parliament. Ireland's former Minister of Finance, Charlie McGreevy, is the European Commissioner overseeing the directive. The briefing document, which was sent out by KDE Ireland, the Irish Linux Users' Group, and the Irish Free Software Organization, lists 10 reasons why software should not be patentable, including the fact that its abstract nature would make it impossible to reliably avoid patent infringement. Also featured in the document are excerpts from the U.S. Federal Trade Commission's October 2003 "Report on Innovation," which warned that software patents are "impairing follow-on incentives, increasing entry barriers, creating uncertainty that harms incentives to invest in innovation, and producing patent thickets."
From ACM's TechNews, March 16, 2005
"Programming Wizards Offer Advice at Developers Confab"
eWeek (03/15/05); Schindler, Esther
- This week's Software Development Expo is an event where programmers can pick each other's brains on effective strategies for developing quality code, and selecting methodologies that work for developers and their team has been an overarching theme. Ronin International consultant Scott Ambler noted that most software development projects come in behind schedule, and every project participant is aware that the schedules are unworkable; he believes a software methodology such as Scrum, in which iterative development is funded in very brief time frames of no more than a month, is an effective solution, one that cuts bureaucracy and allows developers to learn upfront which features are important. Gerald Weinberg, a 50-year software development veteran and author of "The Psychology of Computer Programming," said developers' habit of misrepresenting the progress of projects to management has not changed in the last half century. "[Management doesn't] know what you're doing, so they reward the appearance of work," he remarked. Although the primary goal behind developers' acceptance of methodologies is a desire to produce quality code that does what customers want, they are also aware of the challenges to management; Sun Microsystems Java architect David Heskell said it is often mistakenly believed that methodologies that work well on one kind of system are universally applicable to all projects, an assumption negated by the reality of dynamic project needs, technical complexity, and people issues. Google principal engineer Joshua Bloch recommended in a panel discussion on agile methods that developers be especially cognizant of variable naming strategies. His advice was to try to make code read as close to English as possible.
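Bloch's naming advice is concrete enough to illustrate. A hedged before/after sketch in Python (the function and its names are invented for this example, not taken from the panel):

```python
# Opaque version: correct, but the reader must decode what it does.
def f(a, b):
    return [x for x in a if x not in b]

# The same logic renamed so it reads almost like English.
def remove_banned_words(words, banned_words):
    return [word for word in words if word not in banned_words]

print(remove_banned_words(["spam", "ham", "eggs"], {"spam"}))  # ['ham', 'eggs']
```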
"Looking at Open Source Software Through Arabeyes"
Islam Online (03/16/05); Noronha, Frederick
- A group of Arabic-speaking volunteers is working to create open source tools that will make computing more accessible to hundreds of millions of people who use Arabic lettering; the Arabeyes project aims to establish itself as the nexus of Arabization efforts, which previously have been initiated by overseas students and stopped when those students finished their studies. Besides Arabic, the effort will also benefit other South Asian scripts such as Urdu, Pashto, Farsi, and even Hindi. In all, volunteer Mohammed Sameer from Cairo estimates between 225 million and 400 million people will directly benefit from these efforts, and he sees the core group of about 13 daily contributors as the Arabic-speaking equivalents of Richard Stallman or Linus Torvalds. Top Arabeyes contributors include volunteers from Sudan, Algeria, Iraq, and Lebanon. So far, Arabeyes has produced translated versions of Gnome and KDE Linux desktops, the OpenOffice.org 1 suite, and multi-platform text editor Vim; ongoing projects include OpenOffice.org 2, the Firefox browser, and an all-new Akka software layer that allows Arabic to be used on text-based Linux and Unix consoles, while other Arabic-specific projects include the Duali Arabic spelling checker and the Web dictionary interface QaMoose. Arabeyes.org has more than 500 registered users and focuses on programs for Linux. Sameer welcomes people who want to help on localization projects, and suggests they first think about the language needs and look for places where they can fill the gaps. Sameer is also a member of the Egyptian Linux user group responsible for promoting Linux in that country.
"Open-Source Movement Now In Hands of Hired Guns"
Investor's Business Daily (03/15/05) P. A4; Brown, Ken Spencer
- Corporate programmers have for the most part supplanted volunteer programmers as developers of core open-source software. IBM committed $1 billion to the development and promotion of the open-source Linux operating system four years ago, and has since made over 500 software patents and 30 software applications freely accessible to open-source programmers. "As Linux goes mainstream, the market gets bigger and the dollars available around the world grow, it becomes a great business opportunity," notes Open Source Development Labs CEO Stuart Cohen. Many companies are devoting their developers' time to the improvement of Linux in the hopes of ensuring that the OS is compatible with their hardware and software, while Cohen says some firms are gambling that increasing demand for Linux will in turn raise sales of related products. Corporate involvement benefits Linux by enhancing the OS with industrial-grade features that volunteer programmers would take years to develop. Linux creator Linus Torvalds is not concerned about any company dominating the development of Linux so that it gains a competitive advantage over rival Linux firms, because open-source development follows a democratic model to guarantee that only the best ideas prevail. In addition, improvements to the software are available to anyone through Linux's open licensing scheme. Andrew Morton, a chief deputy of Torvalds', maintains that most programmers, even commercial ones, develop a sense of loyalty to Linux that is stronger than corporate fealty.
From ACM's TechNews, March 16, 2005
"Key Open-Source Programming Tool Due for Overhaul"
CNet (03/14/05); Shankland, Stephen
- The widely used GNU Compiler Collection (GCC) will receive new optimization capabilities that should boost performance of open-source software compiled using the tool, including Linux, Firefox, OpenOffice.org, and Apache. GCC 4.0 will add new optimization technologies that enable the compiler to not only optimize local portions of a program, but take the overall structure of the program into account; using a technology called scalar replacement of aggregates, data structures spread over a large amount of source code can be broken up, allowing object components to be stored directly in on-chip memory instead of main memory, for instance. A framework called Tree SSA (static single assignment) is also available for optimization plug-ins written later, says CodeSourcery "chief sourcer" and GCC 4.0 release manager Mark Mitchell. The result of the upgrades should be more efficient and effective translation of code from high-level languages such as C into binary code understood by computers; open-source programs that use GCC will also run faster. Mitchell says GCC, part of the original GNU's Not Unix effort and launched in 1987, is becoming more professional and commercial, along with other open-source software. There are now about 10 core programmers dedicated to GCC's development, while Intel, which offers its own high-quality compilers for x86-run software, also contributes to GCC development because so much GCC-compiled software runs on the x86 chip platform. Pathscale also offers a performance-oriented open-source compiler that is derived from Silicon Graphics' Open64 compiler for scientific programs. Pathscale's Len Rosenthal says the goal is to make Open64, which is compatible with GCC 3.3, the default compiler for the x86 platform.
The PatternShare Community
From ACM's Tech News, March 2, 2005
"Cracking Software Development Complexity"
Line56 (03/01/05); El Baze, Nicolas
- Software complexity continues to increase faster than Moore's Law, meaning that programmers will be unable to write and manage code without rethinking how to create programs, writes Partech's Nicolas El Baze. A number of companies are addressing this issue and moving toward model-based frameworks that will allow non-expert employees to rapidly execute new and complex tasks. Language provides a good metaphor for the type of models that are needed to deal with increasing software complexity: children develop a linguistic model that lets them quickly absorb and use new words; similarly, a model-driven approach would not only provide visual abstraction of programming tasks, but actually provide reusable intelligence and be context-aware. Generic application software models would even accommodate new languages by adapting, extending, and evolving existing models. Ideally, programmers would simply enter data, parameters, concepts, and goals to automatically produce working code. Some companies are already pursuing codeless development for specific functions, signaling a shift of intellectual property value from the code itself to the models with reusable intelligence. A model-driven approach to software creation could also help with enterprise application integration, especially as these models are made context-aware through use of semantic and ontology-based technologies; this framework would also have to deal with new distributed applications such as those built on J2EE and .NET. Model-driven software development will not arrive quickly; it will require hard scientific research, much as the mapping of DNA did--a map that is still being transformed into an actionable model.
From ACM's Queue News, February 28, 2005
Quality Assurance: Much More than Testing: Good QA is not only about technology, but also methods and approaches
Quality Assurance issue, vol. 3, no. 1 - February 2005 by Stuart Feldman, IBM Research
Article excerpt:
- Quality assurance isn't just testing, or analysis, or wishful thinking. Although it can be boring, difficult, and tedious, QA is nonetheless essential.
- Ensuring that a system will work when delivered requires much planning and discipline. Convincing others that the system will function properly requires even more careful and thoughtful effort. QA is performed through all stages of the project, not just slapped on at the end. It is a way of life.
WHAT IS SOFTWARE QUALITY ASSURANCE?
- IEEE Standard 12207 defines QA this way: "The quality assurance process is a process for providing adequate assurance that the software products and processes in the product life cycle conform to their specific requirements and adhere to their established plans."
This sentence uses the word process three times. That is a key aspect of QA--it is not a single technology, but a method and an approach.
- Another key point is that quality is not treated as a philosophical issue, but, rather, as measurably meeting expectations and conforming to requirements. The rigor of the process should be chosen to suit the needs of the product and organization.
- Finally, QA is about providing assurance and credibility: the product should work right, and people should believe that it will work right.
- What goes into QA? Testing, of course, is a key activity. There is, however, an adage that "you can't test quality into a product." A solid test plan should catch errors and give a measure of quality. A good QA plan ensures that the design is appropriate, the implementation is careful, and the product meets all requirements before release. An excellent QA plan in an advanced organization includes analysis of defects and continuous improvement. (This feedback loop is characteristic of mature organizations.)
- For physical products, QA involves manufacturing process control, design reviews, test plans, statistical methods, and much more. In relaxed implementations, there is occasional monitoring of the production line, and a few pieces are examined at each stage. In extreme cases, every step is monitored and recorded, intermediate products are torture tested with stresses exceeding the specifications, and many final products are destroyed. (Crash testing isn't always a metaphor.) Only a few outputs make it into the field.
- For software products, there are many QA process choices, depending on the structure of the organization, the importance of the software, the risks and costs of failure, and available technologies. These should be conscious decisions that are recorded and revisited periodically.
HOW STRINGENT MUST QA BE?
- In an ideal world, perfection would be the norm. In the real world, you must make trade-offs. Although some people claim that "quality is free," that is rarely the case. After much trial and error, you may arrive at a well-honed process that delivers high quality reliably and efficiently. Until you achieve that stage, demonstrably higher quality usually involves a longer and more expensive process than simply pushing the product out the door.
- There are many types of requirements to be QA'd. Some involve meeting basic functional specifications: the system or program does the right thing on expected (or unexpected) inputs. Some involve performance measures such as throughput, latency, reliability, and availability.
- Other major considerations depend on the operating environment. If the users will have limited understanding or ability to repair problems, the system must be validated on novices. If the system must operate in many contexts, interoperability and environmental tolerance must be verified.
- In certain applications, the costs of failure are so high that it is acceptable to delay until every imagined test and cross-check has been done. In others, repairs are acceptable or affordable, or misbehaviors are tolerated. Just as a bank runs different credit checks on people who want to borrow $1,000 and those who want $1 million, different QA processes are appropriate for spelling checkers and cardiac pacemakers. Much of the fundamental work on high-reliability systems was done for military, aerospace, and telecommunications applications that had extremely rigorous requirements (and large project budgets); telephone switches and mainframes rarely fail.
- The spectrum of QA rigor covers a wide range:
- Research and experimental software. Requirements for quality may be quite low, and the process may be little better than debugging and a few regression tests. Nonetheless, the risks of embarrassment from failed public demos and withdrawn papers suggest a greater investment.
- Business, productivity, and entertainment tools. These are expected to work, but the occasional failure is (alas) no surprise. When the consequences of a crash or invalid result are acceptable, it may not be worthwhile to invest in a long QA cycle (or so many vendors say).
- Business-critical tools. A much higher standard of planning and testing is required for key organizational software. Software that manages transactions for significant amounts of money, affects people directly, or is required for legal compliance needs to be credible, as well as functional. Errors can destroy an organization or its executives. Any development plan needs a significant investment in quality assurance, including careful record keeping and analysis.
- Systems that are widely dispersed or difficult to repair. When it is difficult or expensive to access all the products, there is justification for extensive testing and design for remote repair. If 1 million copies of a game are sold with a highly visible flaw, the cost of upgrading and repairing could easily exceed the profit. A chip with a design flaw or an erroneous boot ROM can lead to the same unfortunate result. Heavy testing in a wide variety of environments is needed to build confidence, even if product launch is repeatedly delayed. In the extreme, it may be impossible to get to the product because it is embedded in equipment or extremely distant; if the mission is important, the products must be designed for remote repair and/or have unusually high quality standards. Examples include famous exploits for repairing space missions millions of miles from home.
- Life- and mission-critical software. Failures of some systems can cause loss of life (braking systems, medical devices) or large-scale collapses (phone switching systems, lottery management systems). In such cases, elaborate QA is appropriate to avert disaster. It is not unusual for testing and other QA steps to absorb more than half of the elapsed development time and budget. Analysis must extend far beyond single components and functions--the behavior of the entire system must be assured.
- Read the rest of this article at acmqueue.com http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=276
From ACM's Tech News, February 28, 2005
"Too Darned Big to Test"
Queue (02/05) Vol. 3, No. 1, P. 30; Stobie, Keith
- As software grows bigger and more complicated and concurrency and distributed systems become commonplace, handcrafted tests become a less reliable means of spotting bugs, writes Keith Stobie, a test architect in Microsoft's XML Web Services group. Keeping test methods economical requires testers to better comprehend their techniques' fault models and their application so that they can be improved over stochastic testing. Addressing large systems entails good unit testing (including good input selection), good design (including dependency analysis), good static checking (including model property checking), and good concurrency testing up front. Selecting and prioritizing test cases through code coverage, customer usage data, and the all-pairs approach to controlling the number of different configurations can increase efficiency. However, code coverage should not necessarily be employed to judge tests. Test coverage and solid stochastic tests can be produced through the use of models, which can also function as test oracles. Convincing testers to embrace model-based testing is a key challenge, as many testers lack the training to consider what they are testing in an abstract manner. Rigorous unit testing and test-driven development must be required by test groups and accepted by product developers, while controlling unit dependencies ensures the achievability of integration quality.
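The all-pairs technique Stobie mentions can be sketched concisely: instead of testing every combination of configuration parameters, select rows until every pair of parameter values appears in at least one test. The greedy version below is a minimal illustration with invented parameters, not Microsoft's actual tooling.

```python
from itertools import combinations, product

def allpairs(params):
    """Greedily pick configurations until every pair of parameter
    values is covered at least once (all-pairs testing)."""
    names = list(params)
    # Every (param-index, value) pair across two distinct parameters.
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))

    def pairs_of(row):
        return {(i, row[i], j, row[j])
                for i, j in combinations(range(len(row)), 2)}

    candidates = list(product(*(params[n] for n in names)))
    chosen = []
    while uncovered:
        # Pick the candidate covering the most still-uncovered pairs.
        best = max(candidates, key=lambda row: len(pairs_of(row) & uncovered))
        chosen.append(best)
        uncovered -= pairs_of(best)
    return chosen

# Invented example: the full product has 8 configurations,
# but pairwise coverage needs fewer rows.
configs = allpairs({
    "os": ["linux", "windows"],
    "browser": ["firefox", "opera"],
    "db": ["mysql", "postgres"],
})
print(len(configs))
```

The greedy heuristic is not optimal, but it captures the idea: coverage of value pairs, not full combinations, keeps the test matrix tractable as the number of parameters grows.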
From Queue E-Mail Newsletter, October 13, 2004
Linguae Francae: Is programming language a misnomer?
by Stan Kelly-Bootle
From the Programming Languages issue, vol. 2, no. 9 - Dec/Jan 2004-2005
Article excerpt:
Many linguists are still busy trying to reconstruct the single ur-language presumed to have evolved over untold millennia into the thousands of human tongues--alive and dead, spoken and written--that have since been catalogued and analyzed.^1 The amazing variety and complexity of known languages and dialects seems, at first parse, to gainsay such a singular seed.
Yet many find a deep, mythic confirmation in the Tower of Babel story: "Now the whole world used the same language and the same words" (Genesis 11:1). And, of great computer-theoretic significance, note how Jahweh upsets this Esperantic paradise and launches his logomachic crusade: with a raging, unstructured "GOTO!" followed by "Let us go down and confound their language, that they may not understand one another's speech" (Genesis 11:7). Others, however, especially Dijkstra, see this GOTO as the instruction from hell.^2
Speaking of "Structure" (with a reverent capital S), we move to a major milestone in linguistic theory, and one that seems to argue against the Babel account. I refer to Ferdinand de Saussure's Cours de linguistique générale, published posthumously in 1916. Irony strikes again. The text, considered to be the genesis of structural linguistics and semiotics, was cobbled together from fragmented lecture notes taken by his students Bally and Sechehaye.^3 To cut one of mankind's most convoluted books short, Saussure's famous gist is the "essential arbitrariness" of the correlation between signifiants and signifiés--that is, between signifiers (words, for instance) and the objects they signify. Thus, there's nothing particularly tree-like about the word tree, or even its French equivalent arbre. You may raise the onomatopoeic objection, to which I say: French cats don't "purr," they go "ronron"; and French ducks "ne quackent pas," they prefer to "faire coin-coin."
Saussure's thesis, now universally recognized (with inevitable variations and refinements), can be invoked to cast doubts about whether there was ever one primeval lexicon from which our current "Babel" emerged. Many languages do fall into plausible families and subfamilies. Indeed, the birth of comparative linguistics can be traced to the success in establishing a taxonomy for the IE (Indo-European) super-family. Widespread though IE members are (with English, or rather Englishes, established as the dominants for international discourse), there are some 4,000 more-or-less extant non-IE languages.
Grouping these into larger and larger families is proving to be a rather suspect exercise, since many "creative" (as in "creative accounting") guesses are involved in constructing proto-languages with little or no written evidence. There's no end of wishful leaps based on pure coincidences in shape and sound. And even the same word root found in daughter languages does not imply that that word existed in the proto-parent. As Jared Diamond points out, "Archaeologists [unfairly?] skeptical of linguists' attempt to reconstruct mother tongues love to cite words like `Coca Cola,' shared among many modern European languages."^4
Of more immediate concern to my readers is the interaction between NL (natural language) and so-called programming languages. I say "so-called" because many linguists consider programming "languages" to be the most egregious misnomer since the Big Bang (which I parochially date to 1949 when the Cambridge EDSAC I passed the perfect benchmark by listing hundreds of random numbers!). Yet, whether we like it or not, we are stuck with the word language in a fresh context: sequences of symbols with artificially "frozen" syntaxes aimed at the precise control of machines. This usage is not entirely new. Galileo hinted at the Platonic algorithms that run our cosmos: "No one will be able to read the great book of the universe if he does not understand its language, which is that of mathematics."
We face the familiar, false dichotomy: vague, ambiguous NL versus exact, unambiguous mathematical symbols and axioms. David Lorge Parnas, explaining his techniques for module specification in 1972 when software engineering was emerging with giddy enthusiasm, stressed, "It should be clear that while we cannot afford to use NL specifications, we cannot manage to do without NL explanations. Any formal structure is a hollow shell to most of us without a description of its intended interpretation."^5 And here, I would like to add the need to spell out the motivation. (There's the familiar Euler-Gauss stylistic gulf in mathematical exposition: Euler was lavish in exposing the motives and intermediate arguments in his proofs. Gauss preferred a blunt sequence of steps as if to say, "Follow that, dummies!")
Parnas goes in the other direction, from NL to
formal, when choosing names for identifiers: "...if one makes use of names with a high mnemonic value, both reader and writer tend to become sloppy and use the intended interpretation implied by the mnemonic name to answer questions which should be answered by the formal statements."^6
In the early, "unExtended" Basic days, one had little opportunity for concocting memorable, self-explanatory identifier names. A$ was certainly a string, possibly the employee's name, and her gross pay was the integer A (in cents)! Later on came the Hungarian invasion, reminding us that szTring both looks and sounds like a Magyar string!
Parnas therefore warns against gross_pay, deductions, and net_pay. But, what if the code reads
net_pay := gross_pay + deductions;
Whom to believe?
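Parnas's point can be made concrete in a short Python sketch (the function and figures here are hypothetical, not from Parnas): when the mnemonic name promises one computation and the formal statement performs another, only the formal statement is authoritative.

```python
# A minimal sketch of Parnas's warning; names and figures are hypothetical.
# The mnemonic "net_pay" promises gross pay minus deductions, but the
# formal statement below adds them -- a reader who trusts the name
# answers questions the formal text should answer, and misses the bug.
def net_pay(gross_pay: int, deductions: int) -> int:
    return gross_pay + deductions  # the code says "+"; the name implies "-"

# Checking the code against the name's intended interpretation exposes
# the conflict:
assert net_pay(100_000, 15_000) != 100_000 - 15_000
```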
On the other hand, editor Edward Yourdon himself confesses that his "eyes often glaze over" reading page after page of Parnas's Spartan modules. Parnas, for his part, upholds another structured stricture: "Avoid redundancy. Specifications should tell programmers exactly what they need--no more and no less."
Read the rest of this article at acmqueue.com http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=275
From Lansing City Pulse, February 11, 2005
"MSU Researchers Build a World and Watch It Grow"
Lansing City Pulse (02/09/05); Cosentino, Lawrence
- Michigan State University (MSU) is leading development of the "evolutionary
computation" field with little replicating programs called Avidians, named for
the application in which they are birthed and live for many hundreds of
thousands of generations. MSU's Digital Evolution Laboratory has published the
work in the scientific journal Nature and in Discover magazine. The
replicating software code is special because it evolves to solve particular
problems in ways that humans might not have considered; users of the Avida
program craft experiments, such as seeing how few steps it would take to
compute a particular logic operation. Even though the program's author and the
lab director believed the operation could not be completed in less than 19
steps, the Avidian programs evolved to compute the operation in just 11 steps.
MSU philosophy professor Robert Pennock says Avida is significant because the
final solutions are created entirely by the software without human
intervention. The dozen or so graduate students working in the lab are testing
all sorts of theories, such as the benefits of altruism, evolutionary
advantages of sex, and the role of parasites, while doctoral computer science
student Matt Rupp is working on something slightly different: He is attempting
to create the correct conditions for accidental replication, a discovery that
Rupp says would help validate scientists' hypotheses about the origin of life.
Pennock says evolutionary computation has many practical uses as well, such as
quickly solving complex engineering problems: where to build roads, how to
schedule university classes, or how to time stoplights.
Automakers are already developing the technology to help them design vehicles,
he notes.
How Not to Write FORTRAN in Any Language: There are characteristics of good coding that transcend all programming languages, by Donn Seeley, Wind River Systems
Extensible Programming for the 21st Century: Is an open, more flexible programming environment just around the corner?
Languages, Levels, Libraries, and Longevity: New programming languages are born every day. Why do some succeed and some fail?
From ACM's TechNews, October 15, 2004
"A New Event in Programming?"
CNet (10/11/04); LaMonica, Martin
- Software startup iSpheres announced a new programming language that will allow easy development of event-driven enterprise applications, in which information reaches users immediately. The California Institute of Technology spinoff says event programming language (EPL) is meant for applications that detect fraud, enable real-time trading, guard networks, or monitor incoming RFID data. Analyst Roy Schulte says such event-driven applications will become a business imperative in the coming years because they inform decision-makers about the most relevant and up-to-date company information. EPL is based on Defense Department research done for command and control systems, and is compatible with existing middleware and development tools. Companies will have to set up a special event-processing server that will use iSpheres' server version of EPL, for which the company will charge a royalty; the EPL development language is royalty-free and the company plans to submit it to a standards body by the end of the year. iSpheres' Gary Ebersole says EPL can do in 10 or 20 lines of code what it would take 100 lines of Java code to accomplish. Forrester Research has said enterprise development is more difficult than it should be, and IBM and Tibco Software are working on event-processing software and languages that will address growing needs: IBM's common event infrastructure has already been submitted to the Organization for the Advancement of Structured Information Standards (OASIS). Meanwhile, Schulte says some event-driven language will be paired with the developing business process execution language (BPEL) to dramatically impact the industry.
Click Here to View Full Article
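The event-driven style iSpheres is promoting can be illustrated generically (in plain Python; this is not EPL, whose syntax the article does not show): handlers subscribe to event types, and the server pushes each matching event to them the moment it arrives.

```python
# A generic sketch of event-driven processing, not iSpheres' actual EPL.
# Handlers subscribe to event types; publishing an event notifies every
# subscriber immediately, rather than waiting for a poll cycle.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Push the update to every subscriber as soon as it arrives.
        for handler in self._handlers[event_type]:
            handler(payload)

alerts = []
bus = EventBus()
# A toy fraud rule: flag any trade over a threshold amount.
bus.subscribe("trade", lambda t: alerts.append(t) if t["amount"] > 10_000 else None)
bus.publish("trade", {"id": 1, "amount": 250})
bus.publish("trade", {"id": 2, "amount": 50_000})
print(alerts)  # only the second trade is flagged
```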
From ACM's TechNews, October 13, 2004
"The Quest for Secure Code"
Globe and Mail (CAN) (10/12/04); Kirwan, Mary
- Poor software quality is responsible for every one of the SANS Institute's top 20 Internet security vulnerabilities, yet universities still fail to teach proper coding techniques and government remains cowed by industry lobbying efforts. SANS Institute research director Alan Paller says evaluation and certification programs are needed to ensure that programmers have the proper training, and he notes that even universities appointed by the government to be "Centers of Excellence in Cybersecurity" do not require security courses for their IT graduates. Carnegie Mellon University computer science department head Jeannette Wing says even if students are taught more security, practical realities at the workplace will mean feature-focused code produced quickly, if that is what those students' employers desire. Meanwhile, millions of business customers are hindered by restrictive licenses from tweaking their software purchases. Microsoft emphasizes security during its interview process for prospective employees and evaluates workers on their ability to deliver quality code, but the company has a huge legacy infrastructure and backward compatibility issues, says Wing. The government has made many efforts to intervene and make vendors liable for their products, but has been met with hundreds of millions of dollars in lobbying efforts, notes Paller. Even attempts to make vendors liable with caps on potential damages have not worked, as IT industry lawyers are reluctant to admit that secure code is possible. Rep. Adam Putnam (R-Fla.), chair of the House subcommittee on cybersecurity policy, is expected to make a new push for legislation soon, and the Federal Information Security Management Act is also expected to drive change as vendors cater to the $40 billion federal IT market.
Click Here to View Full Article
From ACM's TechNews, October 4, 2004
"Looking for Patterns"
InformationWeek (09/27/04) No. 1007, P. 58; Babcock, Charles
- The father of object-oriented programming believes teamwork and collaboration, rather than working with an individual tool, will spur software development in the years to come. Grady Booch, chief scientist of IBM's Rational Software unit, says developers will produce "algorithmic snippets of code" that manipulate existing objects and create new ones, and he adds that there may be advances in "building new languages for connecting systems to systems." Booch says aspect-oriented programming is an early example of what system-to-system communication is capable of. There have been no major developments in object-oriented languages, such as Java, C++, and Microsoft's C#, in years. Although Booch has had a heavy influence on modeling, he plans to take a pattern-based approach to development, since underlying patterns sometimes exist between different systems. Java Blueprints, which shows best practices for building a Java application to interact with surrounding pieces of software, is an example of the practical use of patterns.
Click Here to View Full Article
From ACM's TechNews, September 17, 2004
"Digital Alchemy"
Technology Review (09/17/04); Hellweg, Eric
- The computing industry has been pursuing the vision of software emulation for nearly three decades, but such efforts have yielded few results with wider applications beyond enabling one specific program to run on one other kind of processor. Furthermore, these products are often characterized by significant performance degradation. Transitive Software, a startup company, claims to have made a breakthrough with Quick Transit, a program that reportedly "allows software applications compiled for one processor and operating system to run on another processor and operating system without any source code or binary changes" with minimal performance degradation. Transitive CEO Bob Wiederhold explains that his company has been developing Quick Transit for nine years, and its announcement has been met with both excitement and skepticism. If Quick Transit actually does deliver on its promises, it could make the migration of a company's software to new hardware simpler and less expensive, an essential step for firms that want to upgrade or switch servers. The emulation program could also allow a single server to run multiple tasks, facilitating server consolidation and lessening budget overhead and management headaches. Transitive says a half-dozen as-yet-unidentified companies have signed up for Quick Transit, and Wiederhold expects the first customer announcement to be issued in upcoming months. The emulator will be initially marketed to large computer makers, although the technology could eventually penetrate consumer markets such as video games.
Click Here to View Full Article
From ACM's TechNews, September 13, 2004
"Searching for Substance: The Road to Safe Software"
InformIT (09/03/04); McFarlane, Nigel
- Nigel McFarlane writes that commercial software providers offer no guarantee of a software's quality--its reliability, security, usability, etc.--to consumers, but he sees a ray of hope in open-source software development practices. He notes in his study of the closed commercial software development process that independent peer review of the product in the critical stage between academic research and market acceptance testing is often paltry, thus offering an entry point for defects in both intent and execution. The transparency of open-source software, however, supports peer-review processes for defects in execution, and McFarlane argues that closed commercial software will never equal open source on such defects without sufficient investment to offset the weaker review processes. The author observes that the consumer benefits of open-source processes are threatened by oft-abused "rubber stamps" such as due diligence legislation that can bog new players down in a bureaucratic quagmire. McFarlane contends that rubber stamp equivalents must be subjected to independent peer review if consumers are to continue to gain from the peer-review process embodied by open source. "We need academics to argue what the leading indicators for software defects should be, and we need them to resist the special-interest arguments of those that would distort or avoid the statistics," he writes. "We need independent bodies providing regimes, audits, and tests that can be used to produce indicators of anyone's software." The results of these tests must be transparently disclosed, and the strangulation of innovation must be avoided through a graduated system that does not set up any obstacles to entry.
Click Here to View Full Article
From ACM's TechNews, August 18, 2004
"Linux Skills in High Demand as IT Jobs Pick Up"
IT Management (08/11/04); Gaudin, Sharon
- Estimates from Dice.com, a tech professional job board, indicate significant growth in demand from employers for people with Linux skills and experience: Dice CEO Scott Melland says that job listings calling for Linux skills have risen 190 percent over the past 12 months, while the board's overall listings have grown from about 25,000 jobs this time last year to over 50,000 jobs. The number of Linux-related jobs listed on Dice.com has also skyrocketed from between 860 and 900 last year to 2,500 currently. Seventy percent of those Linux job listings call for developers, 20 percent to 25 percent call for systems administrators, and the remainder are for people with blended skills; Melland explains that employers appear to be placing greater value on on-the-job experience than certifications. "The demand for Linux skills is absolutely growing and it's growing faster than the overall demand for tech professionals," Melland reports. The gradual economic recovery is prompting companies to start looking into either converting older systems to Linux or constructing new systems. The growth of salaries for Linux-related IT professionals is concurrent with growth in Linux-related jobs. The average listed salary for a Linux-related IT job is $67,000, a 6 percent gain over wages in other IT fields, while contractors and consultants working with Linux can expect to earn roughly $87,000, on average. Melland concludes that the IT job market has shown remarkable growth over the last year, to the degree that opportunities for tech professionals have doubled.
Click Here to View Full Article
From ACM's TechNews, August 6, 2004
"Programming Wetware"
InformIT (08/06/04); Wolfson, Wendy
- Various projects to develop biological computers are underway, although the obstacles are formidable. Researchers say the value of DNA-based computers is their ability to perform massively parallel problem-solving, but Defense Advanced Research Projects Agency program manager Eric Eisenstadt says the key challenge is creating a biological algorithm to symbolize a real-life optimization problem; his agency is considering organic modeling and simulation for the purpose of detecting biological and chemical agents, while the National Science Foundation's Biological Information Technology and Systems program aims to tackle a wide variety of computational challenges by supporting research at the convergence point of biology and information technology. In April, the Weizmann Institute in Israel issued press releases declaring that Professor Ehud Shapiro's lab had developed a biological computer that uses molecules within living cells to identify certain cancers and to generate cancer-fighting pharmaceuticals. This would be a step toward drug delivery systems that would destroy cancer cells in the human body while leaving healthy cells alone. Meanwhile, the field of synthetic biology is investigating the assembly of simple circuits from biological molecules in the hopes of creating simple, programmable organisms. DNA would serve as a biological computer's "software," while enzymes would function as its "hardware"; circuits composed out of lengths of DNA would facilitate cell-to-cell communication. Standardizing such biological building blocks is the goal of an MIT project focusing on the creation of circuits and other elements fabricated from DNA lengths. Problems that need to be solved in order to perfect biological computing include very slow input/output time, the need for a sterile environment to avoid contamination by other microorganisms, and the failure of computers after several generations because of cell division, mutation, and other biological processes.
Click Here to View Full Article
From ACM's TechNews, August 4, 2004
"Preview: LinuxWorld to Highlight Desktop Linux, Security"
Computerworld (08/02/04); Weiss, Todd R.; Rosencrance, Linda
- Approximately 11,000 people are expected to attend the LinuxWorld Conference
& Expo in San Francisco this week, where a chief topic will be Linux's
penetration of desktop systems. Linux products and implementation strategies
will be showcased by over 190 exhibitors, including IBM, Red Hat,
Intel, Hewlett-Packard, Dell, and Oracle. IDG World Expo President Warwick
Davies observes that Linux is growing in popularity, and IDC analyst
Dan Kusnetsky notes that Linux could be seen as a feasible
alternative by consumers who want to move away from Microsoft's
Windows operating system, provided Linux can support email as well as
access to Web-based applications and the Internet. He adds that
Linux could also become appealing to developers of platform-neutral
software, as long as they can create Java-based applications and
Web services using proper tools and at reasonable prices.
Kusnetsky reasons that many workers' requirements could be
fulfilled with an operating system that supplies common
applications (Web browser, email agent, Java virtual machine, etc.),
so organizations could give them a system in which Linux functions as an
underlying client operating environment for either client/server or Web
server applications. "The whole industry is starting to wake
up to the possibility of Linux on the desktop," boasts Illuminata
analyst Jonathan Eunice, although he cautions that Linux is at an
early stage. Davies says there will be increased emphasis on
Linux security at LinuxWorld, as well as discussion on incorporating
Linux into corporate environments without jettisoning existing corporate
systems. IDC expects Linux's market share to expand from 2.7 percent
with 3.4 million paid license shipments in 2002 to 6 percent with over
10 million shipments in 2007, according to Kusnetsky.
Click Here to View Full Article
From ACM's TechNews, August 4, 2004
"In Competitive Move, IBM Puts Code in Public Domain"
New York Times (08/03/04) P. C5; Lohr, Steve
- In an effort to help simplify the writing of Java applications,
IBM will today announce a contribution of over 500,000 lines of
proprietary software code to an open source software group. "We hope
to spur the further development of the Java community," declares Janet
Perna, IBM's general manager for data management software. The Apache
Software Foundation will receive the code for the Java-based Cloudscape
database, which is valued at $85 million. Any growth in Java applications
fueled by this move will almost certainly benefit IBM by providing more
potential uses for its WebSphere platform, which runs and manages such
applications in direct competition with Microsoft's .Net platform.
Cloudscape is designed to be used as a single database contained
within a software application rather than a complete database program
that functions by itself in corporate data centers; Java specialists believe
the Cloudscape code could be very appealing as a basic Java database.
Apache will rename the Cloudscape database Derby, and retain licensing
and intellectual property rights to the code. Industry analysts call this
development a further illustration of IBM's support for open source
projects to which it has contributed staff, marketing dollars, and code.
Forrester Research analyst Mike Gilpin says, "The Cloudscape code is not
a major factor in IBM's overall platform strategy. So this makes sense for IBM."
Click Here to View Full Article (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
From ACM's TechNews, July 28, 2004
"Researchers Aim for Plain-English Debugging"
Associated Press (07/26/04); Crissey, Mike
- The National Science Foundation has invested $1.2 million in Carnegie Mellon
University professor Brad Myers and grad student Andrew Ko's Whyline
(Workspace for Helping You Link Instructions to Numbers and Events)
project, a debugging program that enables users to ask questions
about computer glitches in plain English. If a program appears to
go wrong during the testing process, a user can hit a "Why" button,
halting the program and causing questions based on programmed events
to appear while highlighting lines of programming code related to
the question in a window. A second window displays what happened while
the program was in operation, complete with a timeline and flow chart.
Myers and Ko tested Whyline with a group of grad students whose programming
experience levels ranged from neophyte to expert, and found that users could
detect bugs eight times faster and do 40 percent more programming with the program.
So far Whyline has only been employed to debug programs in Alice, an academic
programming language that renders interactive 3D worlds using a limited
command vocabulary. A program's complexity determines how easy or difficult
it is for Whyline to present the correct questions and answers, while its
helpfulness could be limited even further by adding Whyline to an even more
complex language, such as Java. Whyline is a component of the national End
Users Shaping Effective Software (EUSES) initiative, whose goal is to make
computers more user-friendly by fundamentally changing their appearance and
operation. Professor Andreas Zeller of Germany's Saarland University has developed
AskIgor, a Web-based debugging program that attempts to explain the cause of
program errors once programmers tell it when the program works and when it malfunctions.
Click Here to View Full Article
From ACM's TechNews, July 19, 2004
"Open-Source Is More Than Just Linux"
Associated Press (07/19/04); Fordahl, Matthew
- Experts say open source software is fundamentally changing the
computer industry, forcing businesses to rethink strategies and enabling
new companies to upset established ones. Technology publisher Tim O'Reilly
says companies able to understand and act on this shift will reap the benefits,
similar to how Intel and Microsoft profited from open hardware standards in the
1980s. Many of the most famous open source projects got started in the last
decade and have roots in academia: The Apache Web server emerged from the
National Center for Supercomputing Applications at the University of Illinois,
the same as the Mosaic Web browser and many of the early Netscape Communications
developers; the institution made the server source code available to anyone,
and Brian Behlendorf and seven other programmers started the Apache project
after sharing patches for the server via an email discussion group. Open source
software usually does not indemnify its end users from possible patent infringement,
and companies such as SCO Group have sued several large users of Linux. Sendmail,
the email server claiming about 40 percent of the worldwide market, has incorporated
as a company that does provide legal protection to its users. Although Sendmail
adheres to open source principles, co-founder and Chairman Greg Olson says the
company's allegiance to open source is not political, but based on business
realities. He says open source provides innovation and a good standards process,
while MetaGroup analyst Corey Ferengul says the conversion of many companies to
open source is not altruistic, but the result of software commoditization.
Companies want to sell more products, and find open source platforms
serve as a basis for their other offerings.
Click Here to View Full Article
From ACM's TechNews, July 16, 2004
"Father of Visual Basic Begs: Stop the Insanity!"
Pittsburgh Post-Gazette (07/09/04); Greiner, Lynn
- In his book, "The Inmates Are Running the Asylum: Why High-Tech Products
Drive Us Crazy and How to Restore the Sanity," Visual Basic architect
Alan Cooper argues that software has become excessively and
unreasonably complicated because programmers and engineers are
designing it for their peers rather than for end users. He contends
that "In our rush to accept the many benefits of the silicon chip,
we [business executives] have abdicated our responsibilities" and
put engineers in charge of the high-tech industry. Cooper says
the programmers' goal is for the programming process to be
simple and seamless, whereas users desire simple and seamless software
operation. He categorizes users as either apologists or survivors,
the former being users who do not mind software's problems because
they relish challenge, and the latter being those who have little
knowledge about computers yet have little choice but to cope with
bad software. The first half of the book is dedicated to the
description and causes of bad software, and Cooper asserts that
allowing software engineers to tackle software failures is akin
to "asking the fox to solve henhouse security problems." His preferred
solution is interaction design, a process that involves profiling the
biographies, needs, and skills of probable users of software
under development and customizing the interface and feature set
to their requirements. Certain areas of Cooper's book, which is
expanded from its original 1999 edition, are in need of updating,
as it mentions some problems that have already been tackled.
Click Here to View Full Article
From ACM's TechNews, July 7, 2004
"Programming Doesn't Begin to Define Computer Science"
ITBusiness.ca (07/04/04); Morris, Jim
- Jim Morris, computer science professor and Dean of Carnegie Mellon
University's West Coast Campus, writes that the fall-off in
college-level computer science enrollments is chiefly due to
a misrepresentation of the field's goals: The computing industry's
cyclical boom-bust pattern owes a lot to students buying into the
idea that computer science can make them wealthy, only to be
discouraged by the bursting of the tech bubble and the offshoring
of computer jobs. Morris notes that there has been a 70 percent
increase in the number of biological science graduates since 1990,
compared to a 10 percent decline in computer science graduates,
which suggests that traditional sciences, with their intellectual
stimulation and humanitarian goals, are much more appealing.
The author asserts that the science of computing is the missing
ingredient in current computer science education, and says the
problems computer science aims to solve have practical applications
with significant social and economic impact. For instance, solving
the problem of reproducing intelligence in machines is critical
to the advancement of robotics, which in turn will play a key role
in space exploration. Likewise, computer security cannot improve
without innovative mathematics, and people can become skilled
in experimental technique by studying how humans interact with
computers. Furthermore, Morris points out that a computer
science education can prepare people for careers in diverse
fields, such as medicine, law, biology, and business. The author
believes that students should obtain a "liberal science" education
with computing as their first focus of study. Morris writes that
students should receive education in certain computer sciences
before college, but notes that high school curriculums currently
lack instruction in "the visions and grand challenges of computer science."
Click Here to View Full Article
From ACM's TechNews, July 2, 2004
"Is Java Cooling Off?"
CNet (06/28/04); LaMonica, Martin
- This week's JavaOne conference highlights the
growing discontent over Sun Microsystems' handling of
the cross-platform programming language. Many
companies and legions of corporate developers depend
on Java as an alternative to Microsoft's .Net and
development tools, but the business difficulties Sun
is facing have carried over into its management of
Java technology. The Sun-controlled Java Community
Process (JCP) has been challenged on a number of
fronts this year, with IBM and BEA going directly to
open-source groups to standardize back-end Java
features, and competition from the open-source Eclipse
foundation. Disunity in the Java community is
especially harmful with Microsoft's strong push in
.Net offerings and the quick adoption of new
enterprise architecture aspects such as Web services,
XML, and the Linux operating system; Java needs to be
on the forefront of those emerging technologies in
order to remain important in the future, says
Forrester Research analyst John Rymer. IBM has called
for Sun to make Java open source in order to
strengthen its competitiveness, and WebSphere
infrastructure director Bob Sutor says an open-source
Java implementation bundled with Linux distributions
would dramatically increase the programming language's
popularity. The open-source community is faster in
approving standards and brings more resources to bear
than Sun's JCP, but Sun executives seem only to be
thinking about how to grow their Java-derived revenue,
says Rick Ross, founder of the Java developer Web site
Javalobby. The JavaOne conference will feature a
panel discussion concerning Java's open-source future,
and Sun software head John Loiacono says the company
is working toward that goal, although he has not said
when it will happen. He says the main concern for Sun
and its customers is that Java does not fragment.
Click Here to View Full Article
From ACM's TechNews, June 30, 2004
"When Standards Don't Apply"
CNet (06/29/04); Becker, David
- Some common computing formats are not formally
ratified as standards, yet have achieved de facto
standard status because of their popularity, and
software vendors that control these popular formats
say standardization would inhibit their ability to
keep pace with changing technology. Some de facto
standards have proven very successful despite their
lack of formal approval, such as the Perl Web
programming language, says Sun Microsystems software
expert and XML co-inventor Tim Bray. In the case of
Microsoft, the company's lock on the .doc and .xls
extensions for Word and Excel documents has helped
keep the Office software suite dominant, while recent
government pressure has led Microsoft to publish XML
schemas that enable users to save their Office
documents in that open standard. Although the European
Union praised Microsoft for the compromise, it
suggested the XML schemas also be submitted to a
standard body for safekeeping. Adobe is somewhat more
open with PDF, freely publishing the specification and
thus allowing for hundreds of derivative tools, but
PDF still remains firmly in Adobe control, which the
company says is necessary in order to add new
capabilities quickly, such as the recent incorporation
of barcodes. Macromedia made its Flash animation
popular by freely publishing the technology in the
late 1990s, but now makes so little money from the
technology some fear it could begin to lag in its
upkeep. Scalable vector graphics (SVG) is a competitor
to Flash under the control of the World Wide Web
Consortium, and open-source expert Bruce Perens says
that all browsers will eventually include SVG plug-ins
and make Flash irrelevant. Dave Winer has kept his
really simple syndication (RSS) publishing protocol
from standards bodies because he fears they would
complicate the technology, which he says must remain
sparse; people who disagree with Winer are flocking
around Atom, which could replace RSS but also works
alongside Winer's format.
Click Here to View Full Article
From ACM's TechNews, June 7, 2004
"Finding Out What's Bugging Computers"
The Star-Ledger (06/06/04); Coughlin, Kevin
- One solution to the rampant problem of buggy
software is to test the code thoroughly before its
public release, which is a much more complex endeavor
than it sounds. "There's not enough time in the life
of the universe to check one million lines of code,"
explains Bell Labs director of systems software
research Howard Trickey, whose Orion project aims to
streamline the testing process by flagging only
genuine errors. A 2002 National Institute of Standards
and Technology study attributes software's general
shoddiness to an accelerated time-to-market and a lack
of liability among vendors. "We as buyers don't put
the market pressure on enough to push for bug-free
software," comments Stevens Institute of Technology
computer scientist Rebecca Wright, while Eugene
Spafford of Purdue University reports that the problem
is complicated by vendors adding more and more
sophisticated features to software. He further points
out that the likelihood of bugs cropping up is rising
now that software is being increasingly geared for
computer neophytes, while colleges are caving to
employer demands for people skilled in the latest
computer languages rather than older ones with greater
reliability. Orion is designed to be an improvement
over other programs by lowering the occurrence of
false positives, and Trickey and his research team
assert that they successfully reduced the number of
false positives in checking a 369,000-line program by
50 percent. The Orion project has received a four-year
grant of $640,000 from NASA and the National Science
Foundation. Computer security analyst and author
Michael Erbschloe argues that Orion and similar
approaches will not eliminate the need to test
software in real-life scenarios, and explains that
this task must be carried out by rotating pools of
test users of varying competence. Click
Here to View Full Article
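Orion's goal of flagging only genuine errors can be illustrated with a toy path-aware checker. The sketch below is a hypothetical Python illustration, not Orion's actual analysis: a naive pass flags every division as a potential divide-by-zero, and a refinement discards reports that sit under an explicit `!= 0` guard, cutting the false-positive count.

```python
import ast

code = """
def f(n):
    if n != 0:
        return 100 / n   # guarded: cannot divide by zero here
    return 100 / n       # genuine bug when n == 0
"""

tree = ast.parse(code)

# Naive checker: report every division -- two warnings, one spurious.
naive = sorted(node.lineno for node in ast.walk(tree)
               if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div))

def guarded_lines(tree):
    # Collect divisions nested under an `if ... != ...` test; a real
    # tool would also check that the guard actually mentions the divisor.
    lines = set()
    for node in ast.walk(tree):
        if (isinstance(node, ast.If) and isinstance(node.test, ast.Compare)
                and isinstance(node.test.ops[0], ast.NotEq)):
            lines |= {n.lineno for n in ast.walk(node)
                      if isinstance(n, ast.BinOp) and isinstance(n.op, ast.Div)}
    return lines

# Refined checker: keep only reports outside a guard.
refined = [ln for ln in naive if ln not in guarded_lines(tree)]
print(naive, refined)   # [4, 5] [5]
```

Here the refinement halves the report count while keeping the real bug, the same trade Trickey's team describes at much larger scale.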
From ACM's
TechNews, May 7, 2004
"Helping Exterminate Bugs in Spreadsheets, Web
Applications" Newswise (05/05/04)
- The National Science Foundation has awarded a
five-year, $2.6 million Information Technology
Research grant to the End Users Shaping Effective
Software (EUSES) project, a six-campus initiative to
help eliminate glitches in spreadsheets and Web
applications developed by "end-user programmers."
Experts reckon that there will be 55 million such
programmers by next year, and believe that almost 50
percent of the programs they create will be infested
by bugs. Oregon State University computer science
professor and EUSES director Margaret Burnett says the
project lives by the philosophy of helping end users
improve their programming habits as unobtrusively as
possible. Burnett and other Oregon State colleagues
presented a paper at ACM's recent CHI 2004 conference
in which they compared different techniques of
notifying spreadsheet programmers that they may have
created buggy code; their conclusion was that
"negotiated" interruptions (similar to the automatic
underlining of misspellings by a word processor) were
better than immediate interruptions (such as pop-up
error windows). "We learned: Stay out of
[programmers'] way, give them hints to explore and
they'll get more done," Burnett notes. Carnegie Mellon
EUSES researchers Andrew Ko and Brad Myers presented a
separate report describing a unique debugging
interface that asks programmers questions about "why
did" or "why didn't" something happen; users were able
to find errors eight times faster and make 40 percent
more programming progress. Myers comments that current
debugging tools, which date back to the 1940s, are
overdue for a serious upgrade. Meanwhile, EUSES
researchers at Drexel University, Penn State, Oregon
State, and Cambridge University are trying to gain
insight into end-user programmers' mindset through
observation, while another Oregon State-based EUSES
effort is focusing on the development of summer
science and technology workshops for middle- and
high-school teachers and students. Click
Here to View Full Article
From ACM's
TechNews, May 3, 2004
"Why Certifying IT Workers Won't
Help" ZDNet (04/27/04); Brampton,
Martin
- The Royal Academy of Engineering and the British
Computer Society say that a lot of IT money is wasted
due to a lack of professionalism in software
engineering, and argue for a professional IT community
accredited by them. However, there is no evidence that
links a given examination with practical achievement,
and while the supposed status of engineers is
apparently a goal of some, the certification of
"engineer" does not automatically convey success,
writes consultant Martin Brampton. Large engineering
projects in other fields, such as the Millennium
Footbridge in London and the Concorde supersonic plane,
have proven great failures in terms of being on time,
within budget, and even safe--and those projects were
completed by well-certified professionals. In
addition, there is no real definition of what exactly
a certified software engineer should know. Educational
institutions might play a role, but the examination of
simple concepts such as using logical structure
instead of jumps in programming causes doubt. In his
letter "Go To Statement Considered Harmful," Edsger
Dijkstra observed that successful programmers avoided
goto, but pointing out that connection has not
significantly improved the work of unsuccessful
programmers. The
main reason why software engineering projects fail is
not the fault of the programmers themselves, but the
pace of the industry and the insistence on the part of
companies to release products without sufficient
testing. Then there are political issues that keep
some bad projects from being scrapped and started
over. The rapid pace of computer technology ensures
these types of problems will remain for some time in
software engineering, irrespective of IT
certification. Click
Here to View Full Article
"Federal Programmers Get Agile" Federal
Computer Week (04/26/04) Vol. 18, No. 12, P. 44;
Zyskowski, John
- Increasing numbers of government programming
projects are using agile development techniques, which
advocates claim support the generation of better
software in less time and for less money than
traditional development approaches. Agile development
methods, which are defined by the umbrella term "light
methodologies," discard processes, procedures, and
chores that slow conventional software development
such as formal documentation and centralized planning
and control. Agile development also involves more
frequent and direct customer feedback to programmers
so they can correct problems or modify software
according to clients' wishes before the product is
released. This means that software does not need
arduous revisions afterwards, explains U.S. Army
Environmental Center system analyst Dave Garrett, who
is using agile development to create software that
enhances environmental cleanup at military bases. His
team's strategy, which involves integrating agile
methods with automated software development tools, is
designed to roll out software faster while squeezing
the most out of frugal funding. Pair programming, in
which two programmers work together on joint tasks, is
another agile technique employed by Garrett's team; a
study finds that pair programming causes overall
programming cost to rise by a mere 15 percent or so,
and this increase is neutralized by the savings in
testing, quality assurance, and field support, among
other things. The government, which values
documentation and accountability very highly, might
seem an ill fit for agile development methods, and its
preference for IT contractors certified under the
Capability Maturity Model rating system of the
Software Engineering Institute would appear to support
this assertion. However, many experts believe that the
two seemingly incompatible techniques can complement
each other and yield benefits that combine the best of
both worlds. Click
Here to View Full Article
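The cost claim can be made concrete with back-of-the-envelope arithmetic; the dollar figures below are illustrative placeholders chosen to match the study's roughly 15 percent overhead, not data from the article.

```python
# Hypothetical budget for one project phase, in dollars.
solo_programming = 100_000                 # baseline programming cost
pair_overhead = 15                         # pairing adds roughly 15 percent
paired_programming = solo_programming + solo_programming * pair_overhead // 100

downstream_solo = 60_000                   # testing, QA, and field support
defect_savings = 15_000                    # pairs ship fewer defects downstream
downstream_pair = downstream_solo - defect_savings

total_solo = solo_programming + downstream_solo      # 160,000
total_pair = paired_programming + downstream_pair    # 160,000
# With these (illustrative) numbers, the 15 percent programming
# overhead is exactly neutralized by the downstream savings.
```

The break-even depends entirely on how large the downstream savings turn out to be, which is the crux of the study's claim.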
From ACM's
TechNews, April 28, 2004
"Is Programming Dead?" silicon.com
(04/23/04); Collins, Jon
- Model-driven architecture (MDA) is the next big
evolutionary step in programming, now that software
has standards for application definition such as the
Unified Modeling Language and application
architectures such as .Net and J2EE. Over the past
decades, programming had advanced incrementally as
software was ported from platform to platform; MDA
would make that laborious and unproductive task
unnecessary, allowing programmers to focus on more
important aspects, such as business logic. Several
vendors are already touting tools that will allow
companies to model applications and use code
generating tools, including Borland and IBM's Rational
unit. This type of capability will force better
designed applications, since irresponsible development
groups will not have an excuse for not finishing
specifications and designs upfront. Smaller vendors
such as Quovadx and Select Business Solutions in the
United Kingdom already tout MDA solutions that promise
to automate the majority of code production. In the
future, MDA can expect stiff resistance from some
sectors that will correctly point out the need for
expert programmers to create high-throughput code; but
if automatic code generation tools can fill in the
majority of application code with tolerable
performance, then MDA will certainly benefit
companies. In addition, MDA promises to incorporate
functions many programmers are unfamiliar with, such
as standardized event logging for autonomic computing.
The big picture view is that application vendors will
be responsible for specifying functional components
while customers will use MDA to put those components
together without having to worry about programming
gruntwork, writes Quocirca analyst Jon Collins. Click
Here to View Full Article
"Dire Straits" Information Security
(04/04) Vol. 7, No. 4, P. 36; McGraw, Gary; Hoglund,
Greg
- Software's future evolution will unfold according
to the emergence of seven major trends: The
elimination of bloated operating systems; the
development of components and objects; the advent of
mobile code; the normalization of distributed
computation; a shift in payment models; the spread of
embedded systems; and the widespread adoption of
wireless networks. These trends will cultivate new
business opportunities, but will also worsen the
primary factors responsible for the insecurity of
software--complexity, extensibility, and connectivity.
As complex OSes disappear thanks to the emergence of
encapsulation models such as virtual machines (VMs),
so too will attackers' ability to exploit the OS by
going after a deeply integrated app. Component-based
software will allow applications to be built as
needed, but new software exploits could be fostered by
the advent of network-enabled everyday devices with
embedded systems. The emergence of mobile code carries
a high degree of concern for security exploits, and
all code is expected to become mobile as networking
becomes more pervasive; language-based security models
will grow in importance, and assaults against these
measures will take place in the wild. As distributed
computation is normalized and logically distributed
systems make the transition to geographically
distributed systems, several sub-trends are expected
to unfold: An increase in security issues as
distributed systems start to depend on the network for
communications between components, and regular
man-in-the-middle and timing attacks. Paying for
software functionality on an as-needed basis promises
to turn computing into a utility, but leaves digital
content open to theft, a problem that may have no
technical solution. The proliferation of embedded
systems into handhelds that are notoriously insecure
represents a good opportunity for exploiters, and the
mass adoption of wireless systems will erase the
physical boundaries of network segments, triggering
more security concerns as wireless components are
incorporated into more business-critical apps. Click
Here to View Full Article
From ACM's
TechNews, March 31, 2004
"Researchers Question I.T. Subcultural
Values" NewsFactor Network (03/29/04);
Martin, Mike
- Over three-quarters of IT projects are doomed to
failure not because of complexity, usability, and new
technology's unpredictability, as many people assume,
but because an IT occupational subculture often
clashes with users and managers, according to Jeffrey
Stanton of Syracuse University. "Occupational
subcultures are groups of individuals who, based upon
their occupation, develop their own language, values
and behaviors that distinguish them from other groups
within an organization," Stanton observes. His
conclusion that subcultural conflicts were responsible
for failed IT projects is based on 18 months of
research with Kathryn Stam encompassing 12 New York
organizations that were implementing major tech
projects, and over 100 interviews. The technical
jargon IT staffers use acts as a barrier that blocks
outsiders' access to knowledge, while another source
of frustration is IT personnel's tendency to explain
things too rapidly. "A tech problem often seems
routine to the IT worker, but--as one user said--it
doesn't help 'when they come in and go zoom, zoom,
zoom, zip, zip, zip with a mouse and they've totally
lost me, so I never learn anything,'" notes Stanton.
IT workers, on the other hand, often harbor feelings
that their contributions are underappreciated by
management and end users. Clif Boutelle of the Society
for Industrial and Organizational Psychology attests
that outsiders respect IT workers who are helpful,
responsive to problems, and are "good teachers,"
although Stam reports that many IT personnel she and
Stanton interviewed expressed a reluctance to
nursemaid non-IT personnel. Stanton says his research
emphasizes the need for organizations to close
subcultural gaps, but cautions that assigning blame is
more likely to exacerbate the situation. A more
effective solution is acknowledging and understanding
subcultures within an organization. Click
Here to View Full Article
From ACM's
TechNews, March 29, 2004
"Rising to the Software
Challenge" VNUNet (03/26/04); Hoare, Sir
Tony
- A major challenge for scientific research in
computing is establishing a connection between the
theory of programming and the products of software
engineering practice, writes Microsoft Research
professor Sir Tony Hoare; he says this task could be
accomplished by organizing a grand challenge project
to motivate programming scientists to reach such a
goal through collaboration and competition. There are
many instances throughout scientific history in which
the drive toward such a challenge has led to
significant breakthroughs, an example in the field of
computer science being the development of a chess
program that was able to play and eventually trounce
an international champion. Hoare comments that similar
challenges in computing research remain, one being the
construction, assessment, and implementation of a
high-level programming language compiler to confirm
that every program it compiles performs in the correct
manner. The development of such a compiler would allow
many existing programming errors to be prevented; the
compiler would be built out of technologies already
embedded in software engineering tools. Hoare writes
that the elements for developing the compiler exist.
"We just need some way of weaving them together into a
single program development and analysis tool, for use
first by researchers, and later by professional
programmers," he maintains. Most grand challenges are
characterized by broad international participation,
rapid and open publication, and the need to solve a
fundamental puzzle that resides at the heart of a
scientific discipline. Click
Here to View Full Article
"Extra Headaches of Securing
XML" CNet (03/29/04); LaMonica, Martin
- Web services pose a serious security threat to
enterprise networks given the lack of understanding
about XML among security professionals and the low
profile security has played in Web services deployment
to date. Current Web services implementations are not
much of a threat since companies have used the
technology to connect trusted entities such as
internal departments and business partners, but as Web
services grows in use and applications become more
powerful, businesses will have to update their
security apparatuses. Hackers infiltrating corporate
networks via Web services interfaces could cause much
more damage than simply knocking out a Web site since
valuable business data is exposed on
business-to-business applications, according to
Gartner analyst Benoit Lheureux. XML messages are
encased in an IP envelope and most current network
inspection technology does not filter for fraudulent
XML content. Forrester Research analyst Randy Heffner
foresees XML denial-of-service attacks where systems
are deluged with too many messages, or XML documents
are manipulated to slow down the system; evidence that
such attacks are possible is already showing up, such
as with a recent SecurityFocus security bulletin
describing an XML External Entity attack, whereby an
incorrectly configured XML parser can be exploited
to gain network access or take down a network. Limited
deployment makes Web services networks a less
attractive target for now, but the high level of skill
required also serves as a deterrent, according to
Sarvega CEO Chris Darby. Sarvega is one of a handful
of new companies whose devices improve XML security or
speed processing. The Sarvega XML Guardian Security
Gateway helps encrypt XML files, enforce authorized
access policies, and create network activity logs. Click
Here to View Full Article
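The External Entity attack mentioned above turns on a parser honoring inline DTD entity definitions; one common defense is simply to refuse DTDs in untrusted input before parsing. The Python sketch below is a minimal hypothetical illustration of that idea, not the logic of any product named in the article.

```python
import xml.etree.ElementTree as ET

def parse_untrusted_xml(text: str):
    # Inline DTDs are the vehicle for External Entity (XXE) and
    # entity-expansion attacks, so reject them outright.
    if "<!DOCTYPE" in text:
        raise ValueError("DTDs are not allowed in untrusted XML")
    return ET.fromstring(text)

# A benign message passes through unchanged.
order = parse_untrusted_xml("<order><qty>3</qty></order>")
assert order.find("qty").text == "3"

# An XXE payload of the kind the bulletin describes is refused
# before the parser ever sees the entity declaration.
malicious = ('<!DOCTYPE foo [<!ENTITY xxe SYSTEM "file:///etc/passwd">]>'
             '<foo>&xxe;</foo>')
try:
    parse_untrusted_xml(malicious)
    blocked = False
except ValueError:
    blocked = True
assert blocked
```

Rejecting the document outright, rather than trying to sanitize it, mirrors what XML security gateways do at the network boundary.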
"Rules for Effective Source Code
Control" EarthWeb (03/25/04); Gunderloy,
Mike
- Larkware lead developer and author Mike Gunderloy
believes every software developer should use some form
of source code control, which promotes more harmonious
collaboration between team members and provides a
secure back-up file of code as well. When choosing the
best source code control system, there are a number of
factors to assess, including price (some systems cost
nothing, others can cost hundreds of dollars per
user); concurrent development style; the system's
repository (some systems store data in a database or
in the file system, which offer varying levels of
security and safety); Internet friendliness; IDE
integration; and cross-platform support. Gunderloy
lists items that developers might wish to consider
storing in their source code control system, such as
source code, database build scripts, Windows Installer
databases, graphical elements such as icons and
bitmaps, license files, Readme files, informal
development notes, test scripts, build scripts, help
files, and documentation. The author notes that
storing everything a developer needs in a single
repository makes it easier to handle support requests
for older versions of products, and adds that
developers should remember that copying or removing
files from source code control makes it more difficult
to re-create previous project versions. Gunderloy
recommends that developers check changes into the
repository frequently and in small increments, which
among other things maintains the usefulness of comments
in the source code control system. He says labels or
tags are useful since people are much better at
recalling names than numerical codes; labels should be
applied to a source code control repository at specific
times. Gunderloy
also posits that developers should create branches
whenever different developers in the same project are
working on code that diverges; branching can also be
used to investigate alternate coding approaches and
their consequences. Click
Here to View Full Article
From ACM
News, March 24, 2004
"Blueprint for Code
Automation" Computerworld (03/22/04) Vol.
32, No. 12, P. 25; Sliwa, Carol
- Model-driven architecture (MDA) is slowly gaining
acceptance among organizations that start with small
projects highlighting MDA's benefits: For instance,
the state of Wisconsin used MDA to develop a new
Web-based unemployment insurance benefit system,
laying down business process specifications on a
diagram instead of writing code. Proponents of MDA say
the approach provides a common language for business
and IT sides, more disciplined code, and cost-savings
from code reuse. MDA is sponsored by the Object
Management Group (OMG) and uses Unified Modeling
Language (UML) and other code-generating tools, and
OMG CEO Richard Soley says at least two companies are
creating all of their code with the MDA approach.
Business analysts and developers jointly map out how
the system will work using UML descriptions and create
a model, then the chosen code-generation tool
automatically generates implementation code. Wisconsin
Workforce Development project director Lee Carter says
another benefit of MDA is the ability to translate
code back into business requirements, making it far
easier for future development teams to understand the
thought behind the application; he adds that MDA
allowed the project to be completed much faster than
anticipated. IT outsourcing provider PFPC reports
similar gains in efficiency as well as accuracy and
fewer hidden design flaws, while PFPC senior architect
Ian Muang says developers in different locations can
also work together more easily with MDA. This
commonality might also facilitate simpler switches in
runtime platforms since business and application logic
is first defined in a platform-independent model (PIM)
before being run through the code-generation tool. UML
2.0 should also provide more flexible and effective
action semantics. Experts note, however, that MDA has
serious drawbacks, including significant cultural
resistance and tools that do not adhere to standards.
Click
Here to View Full Article
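The model-then-generate workflow can be sketched in miniature. Everything below is hypothetical: the `Claim` entity stands in for what a UML tool would capture graphically, and the generator emits plain Python rather than the output of a real MDA toolchain.

```python
# Platform-independent "model": entities with typed attributes.
model = {
    "Claim": {"claim_id": "int", "weekly_benefit": "float", "claimant": "str"},
}

def generate_class(name, attrs):
    # The code-generation step: turn one model entity into source
    # text for the chosen target platform (here, a Python class).
    lines = [f"class {name}:",
             f"    def __init__(self, {', '.join(attrs)}):"]
    lines += [f"        self.{a} = {a}  # {t}" for a, t in attrs.items()]
    return "\n".join(lines)

source = generate_class("Claim", model["Claim"])
namespace = {}
exec(source, namespace)                    # compile the generated class
claim = namespace["Claim"](101, 350.0, "J. Doe")
assert claim.claim_id == 101 and claim.claimant == "J. Doe"
```

Because the model, not the generated source, is the artifact the team maintains, retargeting a new runtime platform means rerunning the generator rather than rewriting code by hand.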
From ACM
News, March 22, 2004
"Why Software Quality
Matters" Baseline (03/04) Vol. 1, No. 28, P.
32; Gage, Deborah; McCormick, John; Thayer, Berta
Ramona
- Software errors that led to the deaths of
Panamanian cancer patients from overexposure to
radiation--and criminal prosecution against the
technicians who used the software--illustrate the
vital need to anticipate and remove glitches before
they become a problem with potentially fatal
consequences. Other deadly incidents attributed to
buggy software include the 2000 crash of a Marine
Corps Osprey tilt-rotor aircraft that left no
survivors; the shooting down of friendly aircraft by
the Patriot Missile System in Operation Iraqi Freedom;
and at least three fatalities resulting from last
summer's East Coast power outage. Among the reasons
given for bad software is a flawed programming model,
poorly thought-out designs, lax testing procedures,
and the unpredictability of program interaction. The
FDA distributes "guidance" documents suggesting that
software manufacturers comply with generally-accepted
software development specifications, keep tabs on
design specs, and formally review and test the code
they create--but without making any specific
recommendations. The Panama incident is causing some
industry experts to consider the possibility that more
stringent regulation of software development is
necessary. Observers note that not only are software
development regulatory agencies few in number, but
existing agencies such as the FDA do not go far enough
to ensure quality software. For instance, the FDA
approves products under either premarket approval or
premarket notification. The former mechanism applies
to dramatically unique technologies that are subjected
to rigorous testing, while the latter applies to
products that fit into existing device categories, and
do not require FDA or corporate trials to be approved;
the Multidata Systems International software
responsible for the radiation overdoses in Panama was
certified under the premarket notification process.
SCC director William Guttman estimates that there
could be 20 to 30 bugs for every 1,000 lines of code
generated by corporate programmers or commercial
software manufacturers. Click
Here to View Full Article
From ACM
News, March 19, 2004
"Q&A: Quality Software Means More Secure
Software" Computerworld (03/17/04);
Willoughby, Mark
- Cigital CTO and author Gary McGraw posits that
software quality and software security are inextricably
linked, and though he acknowledges that the software
industry has started to take software security more
seriously, he cautions that "they have a long way to
go." McGraw says the security, reliability,
dependability, and availability of software must be
considered throughout every phase of the software's
life cycle, arguing that the earlier a flaw is
detected, the less risk it will pose. McGraw puts
software problems into two categories, bugs and flaws,
and contends that people are devoting too much
attention to the former and not enough to the
latter. McGraw says that common "garden variety"
hacker exploits can be remedied with solid software
processes, but static analysis will not fix more
fundamental software flaws. The strategy he advocates
is to start with an outstanding software process, and
overlay it with security best practices; McGraw lists
abuse cases, requirements analysis for security, and
risk analysis for design as examples of such
practices. In talking about his and Greg Hoglund's new
book, "Exploiting Software: How to Break Code," McGraw
maintains that the tome "makes it clear that the way
you break a system is not about attacking security
features but by figuring out what assumptions a
designer or coder has made and making those
assumptions go away." The CTO cites the work of USC
security guru Barry Boehm, who determined that about
50 percent of software problems occur at the
requirements level, approximately 25 percent crop up
at the design level, and only a few are caused at the
coding level; McGraw adds that finding and fixing
problems at the requirements level carries far less
cost than fixing problems in the field, and notes that
software patches are being exploited by hackers to
find security holes. Click
Here to View Full Article
"Tough Road to Quality
Code" InformationWeek (03/15/04) No. 980, P.
51; Sullivan, Laurie
- Because the reliability of automotive software can
impact an automaker's reputation among consumers,
manufacturers are facing increasing pressure to ensure
that such software is free of errors. Automakers are
altering their operations to address the industry's
adoption of software by more closely inspecting
suppliers to make sure that their software development
and testing procedures are rigorous, for example.
Chrysler VP Mark Chernoby points out that glitchy
software can hurt customer satisfaction and slow down
manufacturing cycles, which leads to cost
elevation--while the growing presence of software in
vehicles raises the likelihood of additional quality
errors and business risks. Experts claim in-vehicle
software quality appears to be improving, a conclusion
apparently borne out by statistical data; however,
software-quality standards organizations are currently
pursuing separate standardization efforts, and this
lack of unity could seriously hamper manufacturers'
rollout of software-driven improvements. In-vehicle
software is also growing in complexity: Whereas
software was previously designed to augment the
performance of a particular function, some
contemporary cars boast integrated software that
controls multiple automotive functions. In addition,
drive-by-wire systems employing software rather than
physical mechanisms to control braking and steering
have started to show up. Engineers note the
possibility of software errors being introduced during
the testing stage--for instance, leaving debugging
code in the finished vehicle is risky, since it could
disrupt the function of another device. Automatically
generated software code is appealing to the auto
industry because it could theoretically keep errors to
a minimum if not eliminate them completely; the
trade-off to this approach is a lack of human-style
thinking, which is important to testing for the
real-world effectiveness of code and feature
development. Click
Here to View Full Article
From ACM
News, March 19, 2004
"Why Software Still
Stinks" Salon.com (03/19/04); Rosenberg,
Scott
- Seven out of 19 software pioneers interviewed by
Microsoft Press editor Susan Lammers 20 years ago for
her book "Programmers at Work" were reunited at a
March 16 panel to discuss the current state of
software, and all of them observed that software is
still excessively complicated for both programmers and
users. "Software as we know it is the bottleneck on
the digital horn of plenty," argued International
Software founder and 20-year Microsoft code guru
Charles Simonyi, but the panelists offered differing
and conflicting approaches to address this problem.
Simonyi believes that software design and engineering
should be separated; he says giving designers the
means to sculpt software will cultivate a programming
"silver bullet" that can eliminate the most formidable
challenges of many software development initiatives.
Virtual-reality pioneer Jaron Lanier, on the other
hand, proposed a more radical solution: Removing the
brittleness of software by modeling it after
biological systems, a field known as biomimetics.
Lanier and Macintosh pioneer Jef Raskin disparaged the
open-source software movement as devoting too much
attention to the sharing of code and not enough to the
code itself, but Macintosh operating system author
Andy Hertzfeld defended open source, claiming that
developers do it "because they want people to use the
stuff!" The panelists found hope for reinventing
software in technologies such as RF tags, global
networking, and mobile telephony; Ashton-Tate
Framework creator Robert Carr noted that the average
cycle of technology transformations is 20 years, and
the Internet has only reached the halfway mark.
Rethinking programming and designing better software
cannot be accomplished unless software pioneers become
more willing to share their work. Other participating
panelists included gaming pioneer Scott Kim and
VisiCalc co-creator Dan Bricklin. Click
Here to View Full Article
From ACM
News, March 17, 2004
"Closing In on the Perfect Code" IEEE
Spectrum (03/04) Vol. 41, No. 3, P. 36; Guizzo,
Erico
- With turbo codes, French researchers Claude Berrou
and Alain Glavieux put an end to over four decades of
speculation on whether data could indeed be conveyed
at speeds up to channel capacity virtually devoid of
errors and with very low transmitting power using the
right error-correction codes, as electrical engineer
Claude Shannon theorized. Shannon postulated that the
noise bedeviling all communication channels could be
circumvented by separating data into strings of bits
and adding to such strings a set of "parity bits" that
would help detect and fix errors at the receiving end,
resulting in a group of bits or codeword that in the
right combination with other codewords would make
channel capacity achievable. However, Shannon
hypothesized that attaining capacity required the
random selection of infinitely long codewords, an
impractical solution. The complexity problem is
addressed by turbo codes, which divide the problem
into more manageable units: The scheme eschews the
single encoder/decoder setup at the sending and
receiving ends in favor of a double encoder/decoder
architecture that works in parallel. "What turbo codes
do internally is to come up with bit decisions along
with reliabilities that the bit decisions are
correct," explains Bell Labs researcher David Garrett.
In Japan, turbo codes have been bundled into the
standards for third-generation mobile phone systems,
and Hirohito Suda of NTT DoCoMo's Radio Signal
Processing Laboratory reports that the codes are used
to transmit pictures, video, and mail to cell phones
and other portable devices. The European Space Agency
and NASA have deployed or are planning to deploy turbo
codes in their space missions, while digital audio
broadcasting and satellite links are about to embed
the codes within their systems. In addition, turbo
codes could aid engineers working to mitigate
communication problems such as multipath propagation,
and inject new vitality into long-dormant codes such
as low-density parity check codes, which also reach
capacity via an iterative decoding technique. Click
Here to View Full Article
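Shannon's parity-bit idea is easiest to see in a far simpler ancestor of turbo codes, the Hamming (7,4) code: three parity bits guard four data bits, and recomputing the checks at the receiver pinpoints any single flipped bit. A minimal Python sketch:

```python
def hamming74_encode(d):
    # d: four data bits. Codeword layout: [p1, p2, d1, p3, d2, d3, d4].
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4            # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4            # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # Re-run the parity checks; the syndrome bits spell out the
    # 1-based position of a single flipped bit (0 means no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = 4 * s3 + 2 * s2 + s1
    c = list(c)
    if pos:
        c[pos - 1] ^= 1          # correct the located error
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
noisy = hamming74_encode(word)
noisy[4] ^= 1                    # channel noise: flip one bit
decoded = hamming74_decode(noisy)
assert decoded == word           # the receiver recovers the data
```

Turbo codes go much further, with two decoders iteratively exchanging reliability estimates, but the underlying parity-check principle is the same one Shannon described.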
From ACM
News, March 15, 2004
"Return of the Homebrew Coder" Economist
Technology Quarterly (03/04) Vol. 370, No. 8366,
P. 10
- Henry Ford's assembly line and computerized supply
chains squelched the lively trade of cobblers,
tailors, and carpenters, but now a new type of artisan
is reversing the trend--the homebrew coder.
Programmers have long distributed their shareware via
dial-up bulletin boards, computer disks inside
magazines, or on the Internet--but now cheap Web
hosting, broadband Internet, and convenient online
payment services are helping more programmers ply
their trade independent of a large company. In
addition, the proliferation of programmable devices
such as smart phones is creating a number of unusual
market niches too esoteric for large firms to tackle:
Stockholm-based Salling Software founder Jonas
Salling, for instance, sells a software utility that
lets people with Sony Ericsson phones communicate with
Microsoft Entourage on a Macintosh computer; the phone
acts as a remote control via Bluetooth. Ranchero
Software founder Brent Simmons in Seattle sells a
program called NetNewsWire for the Mac OS X that
allows people to read news and then easily post
comments to their blog sites. Simmons says he enjoys
the creative freedom of developing his own solutions,
and earns about as much working from his garage as he
would in the corporate world. Nick Bradbury, founder
of Bradbury Software in Tennessee, created one of the
first Web-publishing tools and then sold it to
Allaire, now part of Macromedia; his company, which
now sells Web-page editor TopStyle and news-reading
application FeedDemon, provides him with much more
personal satisfaction than a regular company job, he
says. Gaurav Banga and Saurabh Aggarwal in Sunnyvale,
Calif., run one of the most successful homebrew coding
operations with VeriChat, a subscription service that
lets smart phones and PDAs send and receive instant
messages.
From ACM
News, February 20, 2004
"Perl's Extreme Makeover" NewsFactor
Network (02/18/04); Ryan, Vincent
- Programmer Larry Wall and the Perl development
community are performing a dramatic facelift of the
Perl high-level programming language, known as Perl 6.
Perl Foundation President and Perl 6 core developer
team member Allison Randal characterizes Perl 6 as "a
complete rewrite of the internals of Perl and a
revision of the Perl syntax;" Monash University
associate professor Damian Conway says the Perl 6
development team jettisoned the old syntax and built a
cleaner syntax and more robust semantics from scratch.
"In every case, they're changes for the better:
changes that reduce the complexity of the language or
the awkwardness of a particular syntax, or that remove
an unnecessary limitation," Conway insists. Perl 6
also features embedded grammars, new control
structures, a macro language, a more advanced type
system, and a smaller core footprint for running on
personal digital assistants. Conway adds that the new
Parrot interpreter engine will support language
interoperability, while Parrot's lead designer Dan
Sugalski comments that Perl 6 is not the only language
the interpreter will run; he also notes that Parrot's
inclusion of a multi-platform, just-in-time compiler
offers a significant speed upgrade. Conway says Perl 6
will be able to switch to a Perl 5 interpreter running
atop Parrot when Perl 5 is identified in a program,
obviating the need to update existing Perl 5 scripts.
Sugalski predicts that Perl will become available in
larger systems, thanks to Parrot's improved
embeddability and security features. However, Perl will
also maintain its value to diehard developer
enthusiasts that make up the backbone of its users;
Sugalski expresses a personal hope to embed Parrot in
games and office suites.
"Se Habla Open Source?" CNet
(02/16/04); Becker, David
- Developing countries without basic desktop
software in native languages are turning to
open-source solutions, jeopardizing Microsoft's
long-term interests in those potential markets.
Proprietary software vendors usually wait until a
market is well-enough developed economically before
localizing their product, but open-source projects do
not have similar profit considerations. In Rwanda,
about 20 college students are translating 20,000 lines
of English script in OpenOffice into the native
dialect, Kinyarwanda; eventually, the volunteers see
computers bringing greater prosperity to their country
and want to provide their less-educated countrymen
with more convenient access to the technology. There
are other reasons for less-developed countries to
consider open-source software: In Slovenia, the
government and education institutions sponsored a team
of 10 translators to convert the OpenOffice
productivity suite into Slovenian, the rationale being
that open-source software would be more accessible to
local developers and save money over Microsoft's
Slovenian-version software. In Thailand, a
government-led effort to promote open source products
led Microsoft to roll out a localized, bare-bones
version of a combined Windows and Office package. In
India, the software giant has promised to make Office
2003 and Windows available in all 14 major Indian
dialects, while OpenOffice is already available in
five of those languages; previously, piracy threats
had delayed Microsoft investment in Asian markets in
particular. Analysts say open-source software is a new
factor in deciding when proprietary software enters a
market, lest it find itself fighting a localized
open-source monopoly in the future.
From ACM
News, February 18, 2004
"Biology Stirs Software 'Monoculture'
Debate" Associated Press (02/16/04); Pope,
Justin
- University of New Mexico biologist Stephanie
Forrest and Mike Reiter of Carnegie-Mellon University
have received a $750,000 National Science Foundation
grant to explore methods to automatically diversify
software code. The work stems from the belief that in
computer networks, just as in nature, species with
little variation, or "monocultures," are most
vulnerable to epidemics. Computer security specialist
Dan Geer lost his job when he published a paper last
fall suggesting that monoculture programs such as
Microsoft's software are so prevalent that one virus
could cause a great amount of damage. Microsoft
counters that computers and living organisms are only
similar in some ways. Even another major operating
system, such as Linux, would not keep out
sophisticated hackers, says Microsoft chief security
strategist Scott Charney. Last fall, Homeland Security
Department CIO Steven Cooper, responding to
questioning at a Congressional committee hearing, said
the federal government is concerned about its
vulnerability to monoculture and as a result would
increase its use of Linux and Unix. Geer says, "When
in doubt, I think of, `How does nature work?' Which
leads you...to think about monoculture, which leads
you to think about epidemic. Because the idea of an
epidemic is not radically different from what we're
talking about with the Internet." State University of New
York at Stony Brook researchers R. Sekar and Daniel DuVarney are working
to diversify software by targeting the non-functional
parts of code using "benign mutations" to keep
software one step ahead of viruses.
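The "benign mutation" idea, varying non-functional aspects of code so that one exploit cannot fit every copy, can be sketched with a toy layout randomizer. This is entirely illustrative; the actual Stony Brook work diversifies compiled programs, not Python lists.

```python
import random

def diversified_buffer(payload, rng):
    """Store the payload at a per-instance random offset: a benign,
    non-functional mutation. A fixed 'exploit' index that works on
    one copy will miss on another. Purely a toy illustration."""
    offset = rng.randrange(1, 8)
    buf = [None] * offset + list(payload)
    return buf, offset

def read_payload(buf, offset, length):
    # The proper interface knows its own offset, so behavior is
    # identical across all diversified copies.
    return buf[offset:offset + length]

rng_a, rng_b = random.Random(1), random.Random(2)
buf_a, off_a = diversified_buffer("secret", rng_a)
buf_b, off_b = diversified_buffer("secret", rng_b)

# Both copies behave identically through the interface,
# even though their internal layouts may differ.
print("".join(read_payload(buf_a, off_a, 6)))  # secret
```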
"Search for Tomorrow" Washington
Post (02/15/04) P. D1; Achenbach, Joel
- Google has established itself as the first
Internet search engine to achieve utility-like status,
with the service handling more than 200 million
queries daily; however, next-generation search engines
are likely to make Google seem medieval in comparison.
Google's worth as a navigation tool has expanded along
with the amount of valuable data on the Internet, and
e-book author Seth Godin speculates that 2000 was the
year that valuable online content reached a critical
mass. Google's emphasis on practicality--getting
quick, accurate search results through parallel
computing--over bells and whistles is a key factor in
its success. Google resembles other search engines in
that it employs "crawler" programs that automatically
troll the Web, clicking on all possible links; but it
also notes how many other Web pages link to any given
page, which determines how high specific pages are
ranked by Google. Its success in this area signifies
the next logical evolutionary step for search engines:
The emergence of "intelligent agents" that personalize
searches by studying individual users' search patterns
and habits. IBM's Dave Gruhl calls this process user
modeling, in which computers deduce users' interests
and preferences by analyzing interactions between
people. Godin contends that what people want in an
intelligent agent is a "find engine" rather than a
search engine, a digital secretary or helper that
anticipates the information a person wants. Such a
tool will need to better understand information on the
Web, rather than base searches on the presence of
keywords--and for this to happen, Web sites will need
to be tagged with metadata, which is the goal of the
Semantic Web project. Furthermore, future search
engines must be able to carry out searches on all
kinds of digital content--films, music, etc.--not just
text documents.
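The link-counting idea described above is the heart of PageRank, and the standard formulation is a simple power iteration. The following is a hedged sketch; the toy graph, damping factor, and iteration count are illustrative, not Google's actual data or code.

```python
# Minimal PageRank power iteration over a toy link graph.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):
    # Each page keeps a small base share, plus a damped share of
    # the rank of every page that links to it.
    new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# "c" collects links from a, b, and d, so it ranks highest
print(max(rank, key=rank.get))  # c
```

Note that being linked *by a highly ranked page* matters as much as the raw link count, which is what distinguishes this from simple popularity counting.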
From ACM
News, February 13, 2004
"Software Bug Blamed for Blackout Alarm
Failure" Associated Press (02/12/04);
Jesdanun, Anick
- A Feb. 12 statement from industry officials
attributes alarm failures that may have exacerbated
last summer's Northeast power outage to a software
glitch in the FirstEnergy infrastructure. A joint
U.S.-Canadian task force probing the blackout reported
in November 2003 that FirstEnergy staffers did not
take action that could have contained utility
malfunctions because their data-monitoring and alarm
computers were inoperative. FirstEnergy's Ralph
DiNicola says a software bug was determined to be the
cause of the trouble by late October, and insists that
the utility has since deployed fixes developed by
General Electric, which worked in conjunction with
Kema and FirstEnergy to trace the programming error.
DiNicola says the malfunctions transpired when
multiple systems attempting to access the same data
simultaneously were put on hold; the software should
have given priority to one system. This led to the
retention of data that should have been deleted, which
resulted in performance slowdowns, and the backup
systems were similarly afflicted. Kema's Joseph
Bucciero says the software glitch arose because many
abnormal incidents were taking place at the same time.
Finding the error involved sifting through a massive
amount of code, a process that took weeks, according
to DiNicola.
From ACM
News, February 11, 2004
"Save Your Software!
Recycle" InternetNews.com (02/06/04);
Boulton, Clint
- Grant Larsen, an engineer at IBM's Rational
software development tools division, believes the
Re-usable Asset Specification (RAS) for recycling
software code is edging toward standardization. RAS,
which is expected to be ratified in July following its
review by the Object Management Group, will save time
and money on software development and allow engineers
to devote themselves to other important projects. The
specification is designed to take bits of software
code and piece them together to form assets, which can
then be combined into more sophisticated assets for a
"coarse-grained solution," explains Larsen, who also
heads the RAS consortium, which counts IBM,
ComponentSource, and Borland among its members. He
says, "Just like the steering wheel, turn signals and
pedals of a car are slightly different across car
models and makes, software assets may be slightly
different, but will all have an inherent familiarity."
Harnessing this familiarity is key to recycling
software code. Bundling the code together with code
from other systems involves a process of modeling,
visualization, testing, and validation. The RAS
process could be used, for example, to take code from
a Web site and build a better Web site from it. Larsen
notes that many software developers can avail
themselves of RAS because it is written in XML;
furthermore, the specification models predictability,
while its use of metadata helps in the profile
description process.
From ACM
News, December 29, 2003
"The Flexible Factory" Software
Development (12/03) Vol. 11, No. 12, P. 30;
Szyperski, Clemens; Messerschmitt, David G.
- Software creation generally falls into one of
three development strategies--creating code from
scratch, software reuse, and component assembly; the
third option offers the most promise by creating, in
essence, a software supply chain that improves
quality-cost options in keeping with the accompanying
shift of competition. Software components must possess
five primary qualities: They must be multi-use,
non-context-specific, composable with other
components, impervious to modification (encapsulated),
and units of independent implementation and
versioning. Though components are more expensive to
build and sustain than handcrafted or reusable
modules, there are clear benefits, including
minimization of defects and improved quality,
application of component upgrades to multiple users
even though the upgrade primarily serves one user, and
a foundation for more flexible systems that can change
to accommodate fluctuating demands. Economically,
components are more likely to be bought outside, and
offer more potential to boost software productivity
than reuse. The similarity of software components to
standard reusable parts could ignite the software
equivalent of an industrial revolution, once certain
problems are overcome. Challenges that need to be met
include a movement away from the incorrect analogy
between a software program and a hardware product;
programs should not be thought of as hardware products
so much as flexible factories for such products. A
rich and abundant marketplace for software components
is gaining momentum, but is still in an early stage of
development. Finally, an industrial software
revolution has a better chance of taking place if
conventional warranty, liability, and insurance
policies are overhauled to handle software's quirks.
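The five component qualities listed above can be sketched in code. This is a hedged illustration with invented names and interfaces, not drawn from any real component standard: state is encapsulated behind a narrow interface, each component carries its own version, and composition happens through that interface alone.

```python
from dataclasses import dataclass, field

@dataclass
class Counter:
    """A toy software component: independently versioned,
    encapsulated, multi-use, and non-context-specific."""
    version: str = "1.0.0"
    _count: int = field(default=0, repr=False)  # internal, not part of the contract

    def increment(self) -> None:
        self._count += 1

    def value(self) -> int:
        return self._count

@dataclass
class RateLimiter:
    """A coarser-grained asset composed from the Counter component,
    touching it only through its public interface."""
    limit: int
    counter: Counter = field(default_factory=Counter)

    def allow(self) -> bool:
        if self.counter.value() >= self.limit:
            return False
        self.counter.increment()
        return True

limiter = RateLimiter(limit=2)
print([limiter.allow() for _ in range(3)])  # [True, True, False]
```

Because `RateLimiter` never reaches into `Counter`'s internals, an upgraded `Counter` (say, version 2.0.0 with a different storage scheme) can be swapped in without touching the assembly, which is the flexibility the article's "software supply chain" argument rests on.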
From ACM
News, December 17, 2003
"XML--Rodney, Are We There
Yet?" CNet (12/14/03); Ruh, William
- The dot-com bust left XML with few ardent
supporters and little respect; however, XML's fortunes
have now turned as major software vendors have either
already included support for the standard or plan to
do so in their next upgrades. Though XML began as a
simple improvement on HTML, it now is the cornerstone
of many companies' business strategies and is even
reshaping the way industries do business. In health
care and insurance, the HL7 and ACORD standards,
respectively, are representative of how XML provides
the basis for industry data exchange standards. The
Chief Information Officers Council has designated XML
as one of the technological foundations for improving
federal government operations. XML's popularity comes
as organizations begin to recognize the importance of
data as the lifeblood of the extended enterprise and
for boosting productivity within the company.
Executive management dashboards and other enterprise
information applications are high on the list of
executive priorities today; XML is well able to meet
these demands, especially since it provides the
semistructured type of data businesses need to extract
from forms and documents. XML is also not overburdened
by extraneous services requirements, similar to other
widely adopted technologies such as Ethernet and
Internet Protocol. From a development standpoint, XML
is also easy to learn and work with. Still, some
challenges lie ahead for XML, including implementation
of existing digital signature and encryption standards
for better access control and security; also, XML
involves significant bandwidth requirements, though
these are no greater than those of comparable technologies, such
as Adobe Systems' Portable Document Format.
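The kind of semistructured extraction described above is straightforward with any XML toolchain. Here is a minimal sketch using Python's standard library; the document shape is invented for illustration, and real industry schemas such as HL7 or ACORD are far richer.

```python
import xml.etree.ElementTree as ET

# A toy semistructured insurance document (invented shape).
doc = """
<claim id="C-1042">
  <insured>Ada Lovelace</insured>
  <amount currency="USD">1250.00</amount>
</claim>
"""

root = ET.fromstring(doc)
record = {
    "id": root.get("id"),                      # attribute on the root element
    "insured": root.findtext("insured"),       # child element text
    "amount": float(root.findtext("amount")),  # text parsed into a number
    "currency": root.find("amount").get("currency"),
}
print(record["id"], record["amount"])  # C-1042 1250.0
```

The same few lines work for dashboards and enterprise information applications: the tags carry the structure, so no fixed database schema is needed up front.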
From ACM
News, December 17, 2003
"Scientific Research Backs Wisdom of Open
Source" NewsFactor Network (12/15/03);
Martin, Mike
- Researchers at the National Science Foundation
(NSF) and the University of California, Irvine, are
united in their opinion that open-source software
development is far more advantageous than closed
corporate development in terms of quality, speed, and
cost. "Free and open-source software development is
faster, better, and cheaper in building a community
and at reinforcing and institutionalizing a culture
for how to develop software," declares Walt Scacchi of
UC Irvine's Institute for Software Research, who has
presented a series of online reports detailing the
development model's advantages in the hopes that
businesses can understand the implications of making
internal or external investments in open source.
"We're not ready to assert that open-source
development is the be-all and end-all for software
engineering practice, but there's something going on
in open-source development that is different from what
we see in the textbooks," he adds. Scacchi notes that
most open-source development projects fail, and the
study of Linux and other successful open-source
projects makes a clear contribution to open-source
research. David Hart of the NSF reports that Scacchi,
along with Santa Clara University's John Noll, the
University of Illinois' Les Gasser, and UC Irvine
colleague Richard Taylor, are taking the lessons
learned from open-source practices and applying them
to devise new design, knowledge-management, and
process-management tools for development projects
spanning multiple organizations. Gasser and
Scacchi, for instance, are drawing insights about the
relationship between bug disclosures and software
quality by mining open-source project databases, which
Scacchi describes as a cherished resource not found in
corporate communities. Scacchi and colleagues are
studying over 100 open-source projects covering
categories such as network games, industry-backed
projects such as IBM's Eclipse, and Web infrastructure
projects.
From ACM
News, December 15, 2003
"How Will Web Services Ultimately Change
Business?" CIO (12/01/03) Vol. 17, No. 5, P.
108; Jahnke, Art
- Web services will not only allow computer systems
to share a common language, but also enable them to
create solutions independent of humans: This ability
promises to change how companies operate in ways
unforeseen today, almost certainly affecting the role
of the human technology worker. When first developed,
the radio was mostly seen as a way for ship captains
to communicate with on-shore colleagues, notes
University of California at Berkeley economics
professor Hal Varian, one of the speakers discussing
Web services at the Symposium on the Coevolution of
Technology-Business Innovations held at IBM's Almaden
Research Center this fall. Today, radio is not seen as
a niche application, but a technology with
wide-ranging social and technological effects. Web
services technology promises similar change: In
businesses, Web services are certain to take over some
decision-making tasks of humans, but artificial
intelligence will not be able to understand how to
create new markets or customer desire until computers
learn to think like humans, says Eastern Municipal
Water District CIO Daniel C. Ashley. Web services are
similarly unsuitable for the financial world, where
in-house online transaction systems have proven far
more responsive, says WorldGroup Consulting's Rodney
Griffin. Though Web services may provide peripheral
application functions, they will not replace
companies' core applications where performance is key.
Eighty percent of application maintenance labor
involves specification, problem determination, and
human interaction--areas in which computer-driven Web
services cannot compete, adds College of the Canyons
associate professor Dean Cashley. Those functions will
remain the domain of human IT workers, while Web
services will take over many purely technical computer
programming tasks and be incorporated in commercial
enterprise products.
From ACM
News, December 24, 2003
"Software Glitch Brings Y2K Deja
Vu" CNet (12/19/03); Becker, David
- Echoes of the Y2K bug are reverberating as
software maker PTC attempts to correct a date-related
glitch that threatens to render software on thousands
of computers around the world inoperative after Jan.
10. PTC's Joe Gavaghan reports that PTC programmers
had to set a date for infinity so that the software
could recognize dates, and chose 2 billion seconds
since 1970, which means the date recognition function
will fail after Jan. 10 and make the software
inoperable. Gavaghan says the bug was uncovered last
week, and PTC engineers have been scrambling to build
and test patches since; two patches that apply to some
of PTC's most widely distributed products were issued
on Dec. 19, and fixes for other applications are
forthcoming. The glitch threatens to affect the
majority of 35,000 people who employ Pro/Engineer,
Windchill, and other PTC products worldwide. "It's
such a simple flaw; we don't believe it requires
extensive testing to deploy the patches," explains
Gavaghan. "It should take only a couple of minutes for
most customers." Some customers praised PTC for its
honesty and promptness in notifying them of the
problem, while others were less than thrilled with the
news, as patch testing and installation would disrupt
their holiday schedules. Gavaghan says the infinity
value will be set to 4 billion with the patches, and
promises that later releases of PTC products will not
be dependent on dates.
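The Jan. 10 deadline is consistent with 32-bit time arithmetic: Jan. 10, 2004 is the day the Unix clock passed 2^30 seconds, and with "infinity" set at 2 billion there is almost no signed 32-bit headroom left. A sketch of the arithmetic follows; the precise overflow mechanism is our inference, since PTC did not publish it.

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
INFINITY = 2_000_000_000   # PTC's "infinity": 2 billion seconds since 1970
INT32_MAX = 2**31 - 1      # largest signed 32-bit value (~2.147 billion)

# Jan. 10, 2004 is exactly when the Unix clock crossed 2**30 seconds.
print(EPOCH + timedelta(seconds=2**30))  # 2004-01-10 13:37:04+00:00

# With "infinity" at 2 billion, any signed 32-bit sum involving it
# can only absorb about 147 million more seconds before wrapping
# negative -- which is why a sentinel so close to the type's
# ceiling is a fragile choice.
print(INT32_MAX - INFINITY)  # 147483647
```

Raising the sentinel to 4 billion, as the patches do, only works because the value is then treated as unsigned (or wider than 32 bits); PTC's stated longer-term fix of removing date dependence entirely is the more robust one.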
From Business
Week, December 2, 2003: U.S. Programmers at
Overseas Salaries, by David E. Gumpert.
From ACM
News, November 26, 2003
"Java Toolmakers Work for
Peace" InternetNews.com (11/25/03); Singer,
Michael
- A new group of software firms that use Java
heavily are calling for a unified application
framework that would allow Java tool extensions to be
written once for all standards-based Java integrated
development environments (IDEs). Third-party and
independent software developers would be able to
easily ensure interoperability with the major vendors'
development tools, speeding innovation in the
application development industry, says Gartner analyst
Mark Driver. The effort, tentatively called the Java Tools
Community, comprises SAP, BEA Systems, Compuware, and
Sybase, and is headed by Sun and Oracle.
The group is in strategic talks with IBM and Borland.
Currently, open source and independent developers are
forced to write tool extensions for each vendor IDE,
such as Oracle's JDeveloper, the Eclipse/WebSphere
Studio Application Developer, SunOne Studio/NetBeans,
and Borland's JBuilder. The effort is an outgrowth of
Java Service Request (JSR) 198, which established a
common application interface for Java IDE extensions.
The Java Tools Community needs to establish a scope of
responsibility and work out a formal structure for
collaboration, according to one Sun official. The
program is similar to IBM's Eclipse project meant to
create a general-purpose IDE. Eclipse recently won
further support from Oracle, which joined the Eclipse
board to ensure interoperability with its own
development platform; in doing so, Oracle acknowledged
the growing influence of Eclipse and the necessity of
providing easy links to its own Oracle9i Application
Server and Oracle9i database.
From ACM
News, November 19, 2003
"The Programmer's
Future" InformationWeek (11/17/03) No. 964,
P. 40; Chabrow, Eric; Murphy, Chris
- The growth of cheap overseas competition and
packaged applications are reducing the job
opportunities for corporate programmers in the United
States, giving rise to a new worker model that
stresses productivity and business acumen over pure
technical skills. In fact, Owens & Minor CIO David
Guzman reports a significant decline in the number of
IT professionals who regard themselves as programmers,
given the negative connotations the term has acquired
recently. Archipelago programmer Kevin Mueller
believes savvy companies will adopt a strategy in
which small teams of programmers familiar with
business goals collaborate with business managers to
create problem-solving software, but he notes that
many companies will choose to deal with problems
through brute programming force, which is most likely
to be outsourced in the near-term. Lower programmer
prospects are being driven not only by burgeoning
overseas outsourcing, but by less need for programmers
overall. The move to object-oriented programming and
packaged apps has shifted a great portion of
business-process knowledge to business analysts, while
Archipelago CTO Steven Rubinow says programmers with
the best career prospects are those who receive the
best training and are the most productive. He predicts
that "the lower echelons of the skill levels are going
to be washed away." Booz Allen Hamilton CIO George
Tillman says that future CIOs are more likely to hail
from the company's business unit, while Mueller says
the insular nature of programming puts programmers at
a disadvantage by cutting them off from critical
contacts and customers. Intentional Software owner
Charles Simonyi thinks that 20 percent or more of the
least productive American programmers could be
outsourced in the near future, but their outsourced
jobs will eventually be mechanized by tapping the
expertise of senior U.S. programmers.
From ACM
News, November 12, 2003
"Everyone's a Programmer" Technology
Review (11/03) Vol. 106, No. 9, P. 34; Tristram,
Claire
- Intentional Software co-founder and billionaire
Charles Simonyi proposes a radical rethinking of
software, which has become too complex to
understand--a disadvantage that has led to unfixable
software failures, abandoned projects, and tens of
billions of lost dollars. Simonyi believes the
solution lies in a simple yet powerful coding
methodology both programmers and users can understand.
The first step to accomplishing this is to examine the
flaws in existing programming practice, and Simonyi
thinks most failed software projects stem from
developers who must do three jobs simultaneously: They
must comprehend the clients' needs, which are often
complicated; they must convert these requirements into
computer readable algorithms and interfaces; and they
must produce flawless, bug-free code with machine-like
precision--an impossibility, given that human beings
are inherently error-prone. Simonyi seeks to strip
down the first two tasks to the bare essentials and
eliminate the third task altogether by automating
programming's drone-like elements and creating an
intuitive programming interface, thus leaving
programmers free to focus on program design. Key to
this is the development of a software "generator"
instructed through a simple interface or "modeling
language;" Simonyi's goal is to tightly integrate the
model and the programming so that users and
programmers can shift between many different
perspectives of the models and change the programs as
they wish through slight modifications. The core
infrastructure of intentional programming is
aspect-oriented programming, a method that lets
programmers quickly alter all instances of related
commands. Simonyi says the new program generator model
resembles a PowerPoint interface, with which people
can compose presentation slides by pasting text,
images, or charts into specific areas of an intuitive
virtual environment. Simonyi's colleagues and rivals
are taking different approaches to software modeling:
James Gosling of Sun Labs supports a technique to plug
existing code into a graphical modeling interface.
Simonyi predicts that Intentional Software will not
roll out a commercial intentional programming product
for at least two years, but once such products are on
the market, programmers will be able to build far more
complex programs than can be made using current
techniques.
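Aspect-oriented programming, named above as the core of intentional programming's infrastructure, lets one change ripple across all related commands. A minimal sketch uses a Python decorator as the cross-cutting "aspect"; the tracing concern and function names are invented for illustration.

```python
import functools

calls = []  # a cross-cutting record of every traced invocation

def traced(func):
    """An 'aspect' applied uniformly: editing this one wrapper
    changes the behavior of every function it decorates, without
    touching any of those functions individually."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        calls.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

@traced
def open_account(owner):
    return {"owner": owner, "balance": 0}

@traced
def deposit(account, amount):
    account["balance"] += amount
    return account

acct = deposit(open_account("ada"), 100)
print(acct["balance"], calls)  # 100 ['open_account', 'deposit']
```

Full aspect-oriented systems such as AspectJ go further, matching join points by pattern rather than requiring an explicit decorator on each function, but the principle of altering all related operations from one place is the same.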
From ACM
News, October 29, 2003
"The Future of Software
Bugs" Computerworld (10/27/03) Vol. 31, No.
49, P. 32; Thibodeau, Patrick
- The continuing threat of software bugs stems from
a variety of factors, including software vendors and
in-house development teams that rush testing and
sacrifice quality so they can rapidly move products to
market; academic computer science programs that place
more emphasis on development than testing; and
legislation that absolves developers of the blame for
damages users suffer as a result of faulty products. A
major problem is the fact that "Most software projects
have not been designed for quality," according to Herb
Krasner, director of the University of Texas at
Austin's Software Quality Institute. A long-term
initiative, partly spurred by the software glitch that
led to the destruction of the Mars Polar Lander in
1999, aims to define software quality standards that
encompass the properties of reliability, usability,
efficiency, maintainability, and portability. The
development of such standards is one of the missions
of Carnegie Mellon University's Sustainable Computing
Consortium (SCC), and SCC director William Guttman
says the measurement of security, dependability, and
other traits will enable users to make quality-based
software purchases. Meanwhile, a project underway at
MIT is concentrating on the development of automated
software testing processes through the creation of
algorithms for generating "inputs," or software
instructions. Florida Institute of Technology computer
science professor Cem Kaner chiefly blames the dearth
of quality software on a lack of legal liability among
vendors for defective products, an issue that has
become especially urgent in light of highly damaging
virus outbreaks that target flawed software. The
National Institute of Standards and Technology
reported in 2002 that users and vendors spend $60
billion annually because of insufficient software
testing.
"Commentary: Business Software Needs a Revolution," It's
too complicated. It's too expensive. That's why it's
change-or-die time. Read
the article from Business Week
The
"Software Pioneers Conference," held in Bonn, Germany, on
June 28-29, 2001, featured some of the top pioneers in the
area of Software and Information Systems and was attended by
over 1,000 software professionals. The
presentations by the software pioneers provide rich
reading material for those teaching Software
Engineering, Software Design, Databases, Systems
Analysis & Design, etc. The presentations (video and
PDF files) are now available to you at http://www.sdm.de/conf2001/index_e.htm.
From ACM
News, October 3, 2003
"Developers Blaze Their Own
Trail" InfoWorld (09/29/03) Vol. 25, No. 38,
P. 36; Knorr, Eric
- The 2003 InfoWorld Programming Survey of 804
programmers and their managers concludes that Web
applications dominate the industry, even though Microsoft
and others insist that developers should switch to
fast desktop clients: 80 percent of respondents report
that such apps are a key component of their server
development, while 53 percent say they favor apps with
a Web-style user interface. Moreover, a significant
percentage of programmers, in defiance of assertions
that Microsoft .Net and J2EE rule the programming
roost, prefer to base their Web apps on simple
languages such as Perl, JavaScript, and VBScript.
Fifty-one percent of those polled note that their
server development includes Web services, and 52
percent are using XML or object-oriented databases.
Certain IT shops have made it a standard practice to
use the tools, languages, and frameworks of
mission-critical apps for departmental apps designed
to increase productivity and satisfy short-term
business demands, but the survey shows that others
accept a certain loss of control by using tools and
techniques more tightly aligned to the job at hand.
Most respondents indicate that the biggest barriers to
software reuse are the effort required to design
reusable software or a lack of awareness of available
software; 44 percent declare themselves satisfied with
current reuse levels, while 41 percent express
dissatisfaction. Sixty-nine percent call "shared
libraries" highly reusable, while 42 percent classify
"components" and 21 percent list dynamic languages as
highly reusable. "People build applications using
scripting tools because the tools are so easy to get
going and because they deliver functionality to
business users so fast," notes an anonymous
consultant. "The downside is that they do not require
the process and discipline of the more robust
applications, so maintenance tends to be very
difficult if not impossible." Click
Here to View Full Article
From ACM
News, September 15, 2003
"Worth Its SALT" EDN Magazine
(09/10/03); Potter, Stephen
- The Speech Application Language Tags (SALT) 1.0
specification is now available free of royalties, and
has been contributed to the World Wide Web Consortium
(W3C). Developed by the SALT Forum, which is comprised
of more than 70 companies interested in speech and Web
applications that combine voice interaction with
conventional interface modes, SALT is a speech-markup
language that promises to give users a greater level
of interaction with electronic devices. Indeed,
multimodal applications offer new interactive
possibilities for users of electronic devices that
take advantage of SALT. A SALT speech interface would
allow users to speak directly into a personal digital
assistant; would allow "hands-free" and "eyes-free"
interactions with devices in mobile environments such
as a warehouse or when driving; and allow screen
reading by voice, surfing by voice, rapid data entry
by voice, and point-click-and-speak features. Users
would be able to ask a map, "How do I get from here to
there?" HTML, XHTML, WML, SMIL and other current and
future Web standards will be able to use SALT. Various
companies are now developing SALT-enabled browsers for
various platforms, both multimodal and telephony, as
speech and multimodal capability is expected to make
for more natural interaction with the Web. Click
Here to View Full Article
From ACM
News, September 3, 2003
"Corporate Data in Hand" InfoWorld
(08/25/03) Vol. 25, No. 33, P. 42; Thompson, Tom
- The J2ME platform--a streamlined version of
Java--supports the widest spectrum of embedded and
mobile devices and is highly secure, but there are
tradeoffs. J2ME applications can run on any embedded
device with a Java runtime with little if any
alterations, but differences in Java runtime
deployments can lead to compatibility difficulties.
Developers can generate custom applications with a
minimum of effort by taking advantage of the
platform's application programming interfaces (APIs),
but they cannot use features that do not have an
available API, leaving them with little choice but to
use vendor-specific APIs. J2ME APIs offer available
and easily accessible network support through wired or
wireless links; however, J2ME's Mobile Information
Device Profile (MIDP) does not support data transfers
via any protocol other than HTTP, thus forcing
developers to rely once again on vendor-specific APIs.
Programmers can test code on a PC simulator prior to
trying it out on resource-constrained hardware, but
compatibility or UI problems may stem from
discontinuities between the simulator and the
device's J2ME deployment. J2ME sets
up equality for all developers and vendors and
delivers best-of-breed software frameworks, but the
standards approval process is slow. Such issues are
being looked at: Last November, the Java Community
Process (JCP) revised and enhanced the MIDP standard
with various critical APIs that support secure
HTTP-based links; the amended specification also
allows trusted code to be supported with digital
signatures. The JCP has additionally proposed a Java
Technology for the Wireless Industry spec that
converts currently optional J2ME APIs into standard
services.
From ACM
News, August 29, 2003
"Dumb Software For Dumb
People" Salon.com (08/27/03); Manjoo, Farhad
- Experts argue that computer viruses, particularly
those that have wrought mischief in the last month,
throw into sharp relief Microsoft's flawed software
development model, which focuses on adding needless
complexity to systems, integrating applications too
tightly, and keeping security add-ons turned off by
default. However, experts accuse computer users of
being partially responsible: Packettattack.com's Mike
Sweeney says that most users regard beefing up their
software as an inconvenience, and this attitude only
encourages software companies to consistently turn out
shoddy products. Microsoft responded to the more
recent virus outbreaks with newspaper ads advising
users to add Internet firewalls, deploy the company's
latest security patches, and use an updated antivirus
program. Security researcher Richard Smith calls this
strategy "blaming the victim." He is also puzzled that
Microsoft, with its vast financial resources, has been
unable to eliminate buffer overflows that worms such
as Blaster have been exploiting, and which are
relatively easy to detect. Microsoft CEO Bill Gates'
2002 memo calling for more "trustworthy" software
appears to have had some effect--Smith notes that the
latest version of Outlook is set by default to keep
users from loading executable email attachments that
viruses often piggyback on, but Microsoft still is not
deploying across-the-board security solutions. Steve
Lipner of Microsoft insists that his company's
security initiatives are driven by customer demand,
though Counterpane Internet Security founder Bruce
Schneier contends that security will not improve until
companies become liable for the damage users suffer
because their software is insecure by design.
From ACM
News, August 8, 2003
"XML: Extremely Critical or Exhaustingly
Complex?" ZDNet UK (08/05/03); Donoghue,
Andrew
- Despite its tremendous popularity, XML deployment
has become even more controversial with its
proliferation, since there is no central authority
governing XML standards. Gartner research director
Charles Abrams says the XML meta-language is as
important as was the World Wide Web or client/server
computing. However, the spread of XML standards--no
one keeps an exact count, though Abrams says there
are hundreds--limits the extent to which
companies should commit business-critical operations
to the emergent software technology. A 2002 Gartner
report warned against rapid, extensive XML deployment,
stating that incongruous standards could lead to
wasted effort or even compromised business-critical
transactions. The World Wide Web Consortium (W3C) that
vetted XML does not govern how companies use it,
leaving even single companies to craft their own set
of XML tags. Significantly, however, several large XML
efforts have involved major standards bodies,
including the ebXML standard developed by the
Organization for the Advancement of Structured
Information Standards (OASIS) and a U.N.-related
agency. McKinsey & Company has been similarly
circumspect in its support of XML, advising businesses
to extend existing EDI infrastructures with XML rather
than replace them wholesale. By wrapping messages in
the Simple Object Access Protocol (SOAP) data exchange
standard, Abrams says companies can turn hybrid XML
and EDI exchanges into basic Web services components.
Abrams notes that XML development will mirror previous
important technology rollouts, in which firms initially
overestimate a technology's usefulness and later
underestimate it. Click
Here to View Full Article
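The hybrid XML/EDI exchange Abrams describes amounts to enclosing an existing message inside a SOAP envelope. A minimal sketch using Python's standard xml.etree; the order payload and its element names are invented for illustration, not taken from the article:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def wrap_in_soap(payload_xml: str) -> str:
    """Wrap an arbitrary XML payload in a SOAP 1.1 Envelope/Body."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    body.append(ET.fromstring(payload_xml))
    return ET.tostring(envelope, encoding="unicode")

# A made-up EDI-style order message, already tagged in XML.
order = "<Order><PartNo>AX-100</PartNo><Qty>12</Qty></Order>"
print(wrap_in_soap(order))
```

Once wrapped this way, the same legacy message body can travel over any SOAP-speaking Web-services channel without rewriting the EDI back end.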
From ACM
News, August 6, 2003
"Free Software Faces a Rocky Road to
Court" Financial Times (08/06/03) P. 8;
Foremski, Tom
- Hanging over the LinuxWorld trade show this week
in San Francisco is the SCO lawsuit against IBM over
Unix intellectual property infringement that not only
threatens the future of Linux, but potentially a host
of commercial software as well. SCO says IBM copied
key enterprise computing features from Unix to Linux
in order to make it ready for business-critical
applications. Proving otherwise could be difficult,
given that it is common practice today for software
engineers with specialized expertise to work on
similar projects. When those programmers migrate to
other companies, they take with them knowledge of
intellectual property that could work its way into new
products. Linux leader Linus Torvalds said there are
strict controls protecting against the illegal
inclusion of intellectual property, though he admits
they are difficult to enforce with thousands of
volunteer programmers worldwide. Microsoft Chairman
Bill Gates said open-source software probably
infringed on a wide range of intellectual property,
especially when the project focuses on emulating
commercial programs. Countering that claim, Torvalds
said Microsoft software most likely contained some
Linux intellectual property as well. Industry analysts
see Microsoft and Sun as bankrolling the SCO courtroom
crusade, given that both companies recently signed
multimillion-dollar Unix licensing agreements with
SCO. Sun notes that its unique license arrangement,
agreed to by former Unix owner Novell, protects Linux
users who are also Sun customers; however, one Linux
programmer says the open source community will simply
circumvent the entire infringement problem technically
once SCO's claims are made clear. http://search.ft.com/search/article.0html?id=030805005498
From ACM
News, August 1, 2003
"Code Reuse Gets
Easier" Computerworld (07/28/03) Vol. 31,
No. 36, P. 24; Anthes, Gary H.
- Software reuse was touted heavily in the 1980s,
but widescale adoption remained elusive until
object-oriented languages and applications emerged;
reusing code has been simplified even further with the
advent of XML-based Web services, Universal
Description, Discovery, and Integration directories,
and the J2EE and .Net component models. The power and
compatibility of software tools has also been enhanced
with the appearance of Unified Modeling Language for
object-oriented software management and the Reusable
Asset Specification. Software reuse is not as new as
it may seem, but the practice has long been
complicated by a dearth of protocols, policies, and
tools for tracking, coordinating, searching, and
circulating software assets. "Developers like to share
things informally, and managers might be surprised to
find how much reuse they already have," explains
Fidelity National Financial's Dale Hite. "The leverage
comes from being able to manage where it's at,
locating it, updating it and maintaining it once
versus maintaining it in a number of iterations." At
the core of software reuse are searchable repositories
of software metadata and usage, but there are also
development tools and environments, version-control
software, legacy code wrapping and transformation
tools, and messaging tools that support reuse. Grant
Larsen of IBM's Rational Software division notes that
companies are seeing the wisdom in reusing pre-code
assets such as design specifications and requirements.
Other assets that could also be considered reusable
include best practices, business-process protocols,
test cases, interface specs, documentation, images,
models, patterns, and XML architecture. Andrew
Zimmerman of Citigroup Real Estate Servicing and
Technology says that developing code with reuse in
mind takes longer, but promises to dramatically reduce
the time and effort of subsequent code rollouts. Click
Here to View Article
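The "searchable repositories of software metadata" at the core of reuse can be sketched in miniature. A toy illustration in Python; the asset names, kinds, and keywords are invented and do not follow any particular product's schema:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """Metadata describing one reusable asset."""
    name: str
    kind: str                 # e.g. "component", "test case", "design spec"
    keywords: set = field(default_factory=set)

class Repository:
    """A searchable catalog of reuse metadata."""
    def __init__(self):
        self._assets = []

    def add(self, asset: Asset):
        self._assets.append(asset)

    def search(self, keyword: str):
        """Return assets whose name or keywords mention the term."""
        kw = keyword.lower()
        return [a for a in self._assets
                if kw in a.name.lower()
                or kw in {k.lower() for k in a.keywords}]

repo = Repository()
repo.add(Asset("address-validator", "component", {"postal", "validation"}))
repo.add(Asset("login-sequence", "test case", {"auth"}))
print([a.name for a in repo.search("validation")])  # -> ['address-validator']
```

The point Hite makes about leverage is visible even here: once assets are cataloged in one place, locating and updating a component happens once rather than in every copy.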
From ACM
News, July 7, 2003
"Another Digit, Another
Deadline" Computerworld (06/30/03) Vol. 31,
No. 32, P. 35; Melymuka, Kathleen
- U.S. retailers are facing a deadline reminiscent
of Y2K in terms of the work required, though the
consequences for missing the deadline will not cause
systems to crash. The Sunrise 2005 deadline was issued
by the Uniform Code Council (UCC) in 1997 and expands
the current 12-digit universal product code (UPC) used
in the United States to 13 digits, though the UCC
recommends 14-digit compliance to accommodate reduced
space symbology and radio-frequency identification.
Overseas manufacturers and retailers already use
13-digit product codes, and major U.S. retailers that
deal in foreign goods have already begun remediating
their systems. Without remediation, retailers' systems
will not be able to read the new 13-digit UPCs into
their back-end systems, though scanners used at
point-of-sale terminals can already read 13-digit
UPCs. The grocery retail sector is seen as the laggard
in Sunrise 2005 compliance, especially small,
independent operators without the resources to revamp
their systems. Ahold chief U.S. technology officer Ed
Gropp says conversion is similar to Y2K preparations
in that IT staff have to hunt down scattered 12-digit
fields in databases and other systems. However,
because 12-digit UPCs are simply numbers and not a
date field, they are even more difficult to identify.
In addition, many retailers had parsed the 12-digit
UPC to derive the vendor's identification, though the
UCC did not officially support the practice. These
systems will also be thrown off by Sunrise 2005, since
the new 13-digit UPC will use vendor numbers up to 10
digits in length instead of the consistent six-digit
vendor prefix used in the 12-digit UPC. Click
Here to View Full Article
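The arithmetic behind Sunrise 2005 is modest: a 13-digit EAN-13 code is a 12-digit UPC-A with a leading zero, and the check digit comes out unchanged, so the remediation work lies in widening database fields, not re-encoding codes. A small sketch, using a commonly cited sample UPC-A:

```python
def ean13_check_digit(digits12: str) -> int:
    """EAN-13 check digit: weight the first 12 digits 1,3,1,3,...
    left to right, then take the tens complement of the sum mod 10."""
    total = sum(int(d) * (3 if i % 2 else 1)
                for i, d in enumerate(digits12))
    return (10 - total % 10) % 10

def upc_a_to_ean13(upc: str) -> str:
    """A 12-digit UPC-A becomes a 13-digit EAN-13 by prefixing '0';
    the existing check digit remains valid."""
    assert len(upc) == 12 and upc.isdigit()
    return "0" + upc

code = "036000291452"          # a frequently used sample UPC-A
ean = upc_a_to_ean13(code)
assert int(ean[-1]) == ean13_check_digit(ean[:12])
print(ean)  # -> 0036000291452
```

This also shows why the parsed-vendor-prefix systems the article mentions break: the code still validates, but fixed six-digit slicing of the vendor portion no longer holds once prefixes vary in length.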
From ACM
News, June 23, 2003
"Building a Better
Bug-Trap" Economist (06/21/03) Vol. 367, No.
8328, P. T15
- The significance and pervasiveness of programming
errors are growing as software becomes more deeply
integrated and embedded within society, which in turn
makes traditional bug-finding methods less effective.
Software that can detect bugs early in the development
process is gaining more credence as a result, and such
software is often derived from research into "formal
methods" designed to analyze programs and confirm that
they are performing correctly. One formal technique
involves mathematically describing a program's
appropriate behavior and comparing it to the way it
actually behaves, an arduous procedure that can be
bypassed by concentrating exclusively on a description
of inappropriate behavior and looking for matches.
Another bug-finding method is to draw comparisons
between an old program that works properly and an
upgraded version of that program; Microsoft Research's
Amitabh Srivastava notes that programmers may have
difficulty knowing which test scripts to run, so he
has devised Scout, a system that employs "binary
matching" to juxtapose the programs, find differing
bits, and assign applicable test scripts. A
"high-level" model can be derived from a program's
code and contrasted with a similar model extracted
from an altered version of the program to check for
the presence of both new and existing bugs. The
technique can also be applied in reverse to a certain
degree, using methodology such as the notation of
unified modeling language co-developed by Rational's
Grady Booch. Another method involves comparing a model
derived from a modified piece of code with a model
derived from the program's design specifications, a
technique formulated to detect abnormalities that
could give rise to errors. All of these options can
increase the predictability of the software
development process and accelerate the detection of
unanticipated problems and delays, but a potential
drawback is the risk of estranging programmers by
comparing their performances. Click
Here to View Full Article
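The version-comparison idea behind Scout can be sketched crudely: fingerprint each routine, diff the fingerprints between the old and new builds, and rerun only the tests mapped to routines that changed. A toy Python sketch; hashing bytecode stands in for Scout's binary matching of compiled code, and all the function and test names are invented:

```python
import hashlib

def fingerprint(fn) -> str:
    """Hash a function's compiled bytecode as a crude stand-in
    for binary matching of machine code."""
    return hashlib.sha256(fn.__code__.co_code).hexdigest()

# Version 1 of two routines.
def price_v1(qty, unit): return qty * unit
def tax_v1(total): return total * 1.2

# Version 2: pricing gains a discount; tax is untouched.
def price_v2(qty, unit): return qty * unit * 0.9
def tax_v2(total): return total * 1.2

# Map each test to the routines it exercises (illustrative names).
tests = {"test_pricing": ("price",), "test_tax": ("tax",)}
old = {"price": fingerprint(price_v1), "tax": fingerprint(tax_v1)}
new = {"price": fingerprint(price_v2), "tax": fingerprint(tax_v2)}

# Rerun only the tests touching routines whose fingerprints differ.
changed = {name for name in old if old[name] != new[name]}
to_run = [t for t, fns in tests.items() if changed & set(fns)]
print(to_run)  # -> ['test_pricing']
```

The payoff is the one the article describes: the programmer no longer has to guess which test scripts a change invalidates.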
From ACM
News, June 13, 2003
"Staying Up All Night on Java" Wired
News (06/13/03); Batista, Elisa
- Some analysts claim that enthusiasm toward Sun
Microsystems' Java programming language approaches the
status of a cult or religion. Programmers appreciate
Java's versatility: The language is compatible with
all operating systems and works across mobile phones,
PCs, servers, and many other devices. Java has also
become a symbol of many programmers' derision for
Microsoft's attempt to dominate the software industry,
as well as the general unreliability of Microsoft
software. "If you have a problem and want to write a
piece of software to fix that problem...in C++ you
will run into a lot of technical difficulties," notes
software engineer Nathaniel Baughman. "Java almost
protects you from tripping yourself up." An
independent observer remarks that mobilizing around
Java is considered to be taking a stand against an
"evil" corporate force. Hardcore Java enthusiasts
convene at the annual JavaOne conference in San
Francisco, where Java store manager Jim Childers says
some attendees typically spend $500 to $600 on the
latest Java merchandise. Several JavaOne engineers
derided Microsoft for refusing to embed Java in its
Windows operating system. The number of Java
developers currently totals 3 million. http://www.wired.com/news/culture/0,1284,59225,00.html
From ACM
News, May 21, 2003
"Bugged Out" Salon.com (05/16/03);
Rosenberg, Scott
- Ellen Ullman, author of "Close to the Machine:
Technophilia and Its Discontents," drew upon real-life
experience for her new novel "The Bug," a parable
about a computer programmer confronted with a bug that
thwarts all attempts to lock it down. The basis for
the novel was a resilient bug Ullman encountered as a
programmer for Sybase, which she originally thought
could be included in an essay designed to help people
unfamiliar with coding understand the debugging
process; later on, she found it much more appealing to
turn the essay into "a historical, technical, Gothic
mystery." Ullman decided to set the novel in a
technical environment that no longer existed, in which
programmers had to write everything themselves rather
than employ pre-written layers of code, as is done
today. The author explains that she wanted to include
two opposing themes in the book: The idea that
technology, no matter how complex, can be understood
when deconstructed into its basic components, while
the integration of those components is not necessarily
understandable. Ullman says it becomes harder for
programming to qualify as a science when engineers
move away from hardware and software and toward
applications, and supports this idea by noting that
the advancement of computer capability has not been
accompanied by a similar advance in software writing
ability. Ullman admits that the novel's protagonist
fits the stereotypical mold of programmers being more
technically adroit than socially inclined, but she
observes that programming now involves more social
interaction thanks to the open-software movement.
Ullman laments the technology recession, which has
left a lot of talented programmers unemployed,
resulting in a loss of "institutional memory." She
also finds it disturbing that technology is moving
into surveillance systems due to political pressures.
Click
Here to View Full Article
From ACM
News, May 21, 2003
"The Crisis of Computing's Dying
Breed" Financial Times (05/21/03) P. 11;
Foremski, Tom
- IT workers knowledgeable in mainframe
operations are a dying breed, although the hardware
they run has proven surprisingly resilient to
extinction. IT pundits had predicted server systems
would make the mainframe obsolete, but many companies
are loath to abandon the security, reliability, and
relatively low maintenance costs of their mainframes.
Today, however, university graduates are much more
likely to educate themselves in skills such as Web
services and other, more flexible programming
languages than mainframe Cobol code. Experts say
organizations are facing pressure to find adequately
trained staff to maintain their mainframes, but IBM
computer hardware group head Bill Zeitler says his
company's improvements to the mainframe add viability
to the platform. The recently released and most
powerful IBM mainframe yet, dubbed T-Rex, is capable
of handling entire e-business operations. T-Rex can
replace multiple mainframe machines and help
consolidate dwindling ranks of mainframe
administrators. In addition, IBM is pushing the Linux
operating system, which gives newer IT workers a foot
in the door in terms of mainframe operation, according
to Sageza Group research director Charles King. Still,
professionals with mainframe experience are much older
than the norm, a situation that's a growing concern
for companies. Over half of IT workers with mainframe
experience were over 50 in a Meta Group survey last
year, while less than 10 percent of workers with
Windows NT and Unix skills were that age. Gartner's
Diane Morello warns that few companies have planned
for the day when they are forced to change their
technology platform due to a lack of skilled workers.
Zeitler notes that an estimated 60 percent of
corporate data is on mainframes, and expects that
companies will move very slowly in their shift away
from the platform. http://search.ft.com/search/article.html?id=030521001068
From Peter
Coffee's e-Letters, May 19, 2003
Is COBOL the 18-Wheeler of the Web? Legacy
code may be stuck at the truck stop if tomorrow's
coders choose other rides.
- If you're looking for a hot combination of highly
employable skills, consider writing code to provide
Web services--in COBOL.
- Can a person build a 21st-century IT career on
this 1960s foundation? Well, foundations are better
than shifting sands. Legacy Reserves, a databank for
over-35 IT pros, cites Gartner estimates that
retirement and death will shrink the population of
working COBOL coders by 13 percent between 2002 and
2006, even while 15 percent of all new applications
are being written in the language--and quotes the GIGA
Group as predicting that "The most highly paid
programmers in the next ten years are going to be
COBOL programmers who know the Internet."
- Read about the origins of COBOL: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3628-1
- Read more COBOL facts: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3631-1
- It doesn't matter whether you enter the world of
Web services through the door marked ".Net" or the one
labeled "Java" (and after all, the whole point is to
avoid being locked into either one). Fujitsu's
NetCOBOL for .Net produces Common Language Runtime
code that integrates with Microsoft languages like C#
and Visual Basic .Net; Micro Focus plans to do the
same by summer. Fujitsu has also commissioned a
multimedia training course, with textbook, for
developers who want to learn more.
- Find out more about Fujitsu's NetCOBOL for .Net:
http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3634-1
- Check out the .NET Framework Developer's Guide to
Common Language Runtime: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3637-1
- Read "COBOL on .Net Next for Micro Focus": http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3640-1
- Find out more about Fujitsu's training course: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3643-1
- Meanwhile, at GigaWorld IT Forum last week in
Phoenix, Micro Focus rolled out its Enterprise Server
platform for COBOL/J2EE integration. LegacyJ Corp.'s
PERCobol takes another road, compiling 15 dialects of
COBOL source code to Java Virtual Machine executables.
And earlier this month, Acucorp announced forthcoming
release of its extend6 lineup of COBOL-based
integration tools for XML--along with file system
extensions for handling large objects, such as images
and other multimedia, in record sizes as large as 64
megabytes.
- Find out more about GigaWorld IT Forum: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3646-1
- Read more about Micro Focus' Enterprise Server: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3649-1
- Find out more about LegacyJ's PERCobol: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3652-1
- Find out more about Acucorp's extend6 series: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3655-1
- One of my son's favorite T-shirts bears an IBM
COBOL logo, along with a cartoon drawing of a shark in
sunglasses and a quotation from one of my columns.
Though it's often called a dinosaur, I argued in that
column that COBOL better resembles the shark: It's
been ruling its niche since the beginning, with a
plausible challenger yet to emerge.
- Read Artur Reimann's "COBOL, Language of
Choice--Then and Now: Looking Back to Get a Glimpse at
the Future": http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3658-1
- But if you query Google with "Java," you'll get
almost 34 million hits. Ask Amazon.com about Java
books, and you'll find more than 2,000 titles. Do the
same things with "COBOL" as the search term, and
you'll get fewer than 1 million Google results and
only 700 books. I wonder what language is being used
to cut the paychecks of all those Java coders?
- To paraphrase the old proverb about
accomplishments and credit, there's no limit to how
much work a language can do if its vendors and
practitioners don't care who gets the buzz. But if the
COBOL community wants to know why it gets so little
street cred, perhaps it should take a collective look
in the mirror. Follow that first Google hit to The
COBOL Center, and you'll find a "COBOL News" list
whose leading items are mostly past their first
birthday: Of the six news items on that home page as
of last week, only one bears a 2003 date. Hype alone
is certainly not sufficient, but a modest amount may
be a necessary catalyst to the production of future
talent.
- Check out The COBOL Center: http://eletters.eweek.com/zd/cts?d=79-31-6-7-128123-3661-1
- It takes a while to refill the pipeline of
critical skills, after we notice that it's running
dry. If we're going to need people in, say, 2008 who
have current knowledge of the Internet and the Web,
practiced skills in writing COBOL code that can use
those network resources, and five to 10 years of
experience in leading a development team, now is not
too soon to start developing those assets.
From ACM
News, April 12, 2003
"Cobol Enters the 21st
Century" InformationWeek (05/05/03);
Babcock, Charles
- Cobol has been modified with new object-oriented
features that will make it easier for companies to
integrate Cobol-based applications with other systems.
Although some observers believe it will take some time
for Cobol programmers to learn the object-oriented
programming methods of Cobol 2002, experts say the
new standard ultimately will make it easier to develop
discrete modules of code that interoperate with other
systems. Don Schricker, chairman of the standards
committee that served a key role in having Cobol
modified and adopted by the International Standards
Organization and the International Committee for
Information Technology Standards, calls the changes to
the programming language long overdue and "the biggest
change ever in Cobol." Cobol 2002 comes at a time when
the number of Cobol programmers is on the decline, and
the number of Java programmers is expected to surpass
the number of Cobol programmers in early 2004. Still,
200 billion lines of Cobol code remain in use and 30
billion Cobol transactions are executed daily. Click
Here to View Full Article
From ACM
News, April 9, 2003
"Dream Code" Economist (04/03/03)
Vol. 367, No. 8318, P. 73
- The European Physical Journal recently accepted a
paper by Stefano Bettelli of Paul Sabatier University
detailing his and his colleagues' efforts in creating
a programming language for a quantum computer. A
quantum computer's bits, or qubits, simultaneously
exist in "0" and "1" states, enabling parallel
calculations. The act of measuring a qubit's value
triggers a collapse to a 0 or 1 state, while in
principle a well-organized quantum computation should
prevent this from happening until it becomes necessary
to learn what one of the qubit's values is. Dr.
Bettelli and his colleagues have organized a
programming language composed of quantum registers and
quantum operators--the former are supposed to allow a
program to interact with specific qubits, while the
latter facilitate qubit manipulation. The quantum
operators are the quantum version of logical
operators--"and", "not", and "or"--that form the
foundation of classical programming. In order to
usefully describe the program's unitary
transformations, Dr. Bettelli employs object-oriented
programming, which integrates data and commands into
individual bundles, or objects. Using an object to
represent a unitary transformation makes it relatively
easy to translate classical programming directives
into quantum-level physical control instructions.
Quantum registers and operators will have to be
combined with classical computations by the quantum
programming language. Click
Here to View Full Article
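The registers-and-operators design can be illustrated with a toy classical simulation. This is not Bettelli's language, only a sketch of the concepts: a one-qubit register is a pair of amplitudes for the |0> and |1> states, operators transform it, and measurement yields probabilistic outcomes:

```python
import math

# A one-qubit "register" is its amplitude pair (alpha, beta)
# for the |0> and |1> basis states.
ZERO = (1.0, 0.0)

def apply_not(state):
    """Quantum NOT (X): swap the |0> and |1> amplitudes."""
    a, b = state
    return (b, a)

def apply_hadamard(state):
    """Hadamard puts a basis state into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement outcome probabilities (squared amplitudes);
    an actual measurement would collapse the register."""
    a, b = state
    return (a * a, b * b)

q = apply_hadamard(ZERO)
print(probabilities(q))  # each outcome ~0.5
```

A real quantum program, as the article notes, would delay that final measurement as long as possible, since reading a register destroys the superposition that makes the parallel computation work.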
From ACM
News, April 4, 2003
"Mainframe Brain Drain
Looms" Computerworld (03/31/03) Vol. 37, No.
13, P. 1; Thibodeau, Patrick
- In an effort to staunch an expected hemorrhage of
mainframe expertise, the Association for Computer
Operations Management (AFCOM) plans to launch a Data
Center Knowledge Initiative that AFCOM's Brian Koma
says should spur IT managers "to take some early
action" so they can avoid rising costs of training new
staff to replace retiring mainframe talent. A 2002
Meta Group study estimates that 55 percent of IT
workers with mainframe skills are over the age of 50,
which will lead to a sizable shortage once they
retire. AFCOM's solution, announced at its semiannual
conference in Las Vegas, is to offer online
undergraduate and certificate courses in data center
skills and build a best-practice knowledge base
containing know-how donated by data center managers.
The online courses would be supplied in collaboration
with Marist College in Poughkeepsie, N.Y. Participants
would also be able to receive hands-on training at
semiannual AFCOM events. Mainframe skills are not a
widely taught subject, and companies have to pay an
average maximum cost of $30,000 to $50,000 to train
each new employee. The other side of the equation is
companies that are replacing their mainframes with
more advanced systems, which leads to costs associated
with getting mainframe operators up to speed on new
technology. Click
Here to View Full Article
AFCOM
EYES PROGRAM TO TRAIN IT WORKERS FOR MAINFRAME WORK
- With its Data Center Knowledge Initiative, the
leading data center professional association wants to
help companies train and attract IT professionals in
mainframe work. http://www.computerworld.com/careertopics/careers/skills/story/0,10801,79728,00.html
From Edupage,
March 21, 2003
REPORT SHOWS SHRINKING DIGITAL DIVIDE: A
report released March 19 indicates that the digital
divide in the United States is shrinking as children
from all ethnic groups and income levels increasingly
use the Internet. The Corporation for Public
Broadcasting reported that children under 17 spend
nearly as much time using computers as watching
television, with Internet use among minority and
low-income children surging over the past two years.
More than two-thirds of low-income households have a
computer at home, compared to fewer than half two
years ago. Gaps persist, however, particularly with
respect to high-speed Internet access at home.
Washington Post, 19 March 2003: Read
article
From ACM
News, March 19, 2003
"Setting a Course for Shipshape
Software" Financial Times--FTIT Survey
(03/19/03) P. 7; Newing, Rod
- Economic belt-tightening means companies must
better utilize resources, and this traditional concept
is carrying over into the IT department in the form of
more efficient software that is focused on business
performance. Whereas free-wheeling IT departments
simply added hardware and hashed out quick software
during the economic boom, they are now paring down
their software in order to cut back on hardware costs.
Gartner's Andy Kyte explains that hardware was
relatively cheap during the boom times while
programmers were expensive, but now the inverse is
true. Rational Software general manager Greg Meyers
says many firms do not have good software development
practices and leave testing till the end of the
production cycle, when it is most expensive and
difficult. Existing software applications can also be
refined, but that requires making inner workings
visible so managers can monitor performance, according
to Wily Technology's Lewis Cirne. Business technology
optimization (BTO) is also emerging as a way companies
can get more value out of their IT systems. Such
software solutions provide managers with dashboards
monitoring business-related metrics, such as orders
processed and on-time deliveries, instead of more
technical metrics such as ERP transactions and
database updates. Accenture's Tim Murfet says it is
critical for companies to closely link their IT
systems with actual business value, and says BTO is
one tool helping IT departments make that association.
From ACM
News, March 17, 2003
"Turning Out Quality" eWeek
(03/10/03) Vol. 20, No. 10, P. 22; Fisher,
Dennis
- Carnegie Mellon University fellow Watts Humphrey
is espousing his Team Software Process (TSP) and
Personal Software Process (PSP) as new software
development methodologies that can help improve the
quality of code while getting projects out quickly.
Often, he says, management sets unrealistic goals for
programming teams, which leads to disorderly plans and
haphazard work. Projects should be started earlier in
order to avoid this scenario, allowing programmers to
use structured development methodologies such as TSP
and PSP, which were developed at Carnegie Mellon's
Software Engineering Institute. While blame for
programming errors traditionally falls upon developers
and testers, Humphrey says it is the development
process itself that is often to blame for high error
rates, which he estimates at about one defect per 10
lines of code, even with experienced programmers.
Moreover, training regimens at many software firms
simply re-emphasize outdated methodologies. Demand for
usability and functionality trumps security when CIOs
and IT managers decide to buy software, and software
makers respond by focusing on those factors that
produce commercial results. Microsoft uses Carnegie
Mellon's methodologies for some of its internal
programming, including one recent application with
24,000 lines of code. Microsoft senior program manager
Carol Grojean says the methodologies should cut errors
drastically, from 350 in the last version to about 25
in the latest iteration.
Microsoft, however, does not plan to employ the
methodologies for its commercial products. http://www.eweek.com/article2/0,3959,922974,00.asp
From ACM
News, February 14, 2003
"Are Developers Programmers or
Engineers?" InfoWorld.com (02/12/03); Krill,
Paul
- At the recent VSLive show in San Francisco,
industry veterans Alan Brown and Alan Cooper discussed
many of the problems endemic to software project
management. Cooper, widely regarded as the father of
the Visual Basic programming language, said that
software programmers are frequently mislabeled as
engineers, the difference being that engineers find
solutions while programmers implement them. "Web
designers are called programmers, programmers are
called engineers, and engineers are called architects,
and architects don't seem to ever get called," he
exclaimed. Brown, who directs Rational Software's
Rational Development Accelerator Initiative, noted
that productivity could be significantly improved
through component-based development so that software
can be reused across disparate projects. He argued
that at one end of the project management spectrum are
projects developed according to individual input, and
at the other are teams that operate consistently and
predictably, and follow an unchanging life cycle.
Cooper said that software developers generally work
without supervision, which is a recipe for disaster;
he advised that software projects should include
people who have users' needs in mind to keep
developers on track. Furthermore, he pointed out that
an "adversarial relationship" between managers and
programmers often leads to inaccurate time estimates.
http://www.infoworld.com/article/03/02/12/HNproject_1.html
From ACM
News, February 10, 2003
"NASA Leads Efforts to Build Better
Software" Computerworld Online (02/07/03);
Thibodeau, Patrick
- The 1999 crash of the Mars Polar Lander, which was
attributed to a software bug, made NASA officials
realize that preventing a similar embarrassment would
require an upgrade in software quality and the
development of failure-proof systems. Following the
crash, then-head of the NASA Ames Research Center Dr.
Henry McDonald advised the agency to get more
private-sector parties involved in its
dependable-system initiative, and NASA did so by
inviting top universities to participate in a
collaborative effort. Furthermore, NASA plays a key
role in the Sustainable Computing Consortium (SCC),
which supports the development of software that always
fulfills its function regardless of bugs, a
breakthrough that could benefit all industries. One of
the major problems this effort faces is the lack of
definitive software reliability metrics. NASA and
Carnegie Mellon University are jointly working on a
software architecture that provides reliable
computing, known as the High Dependability Computing
Program. "A bad way to approach any kind of design is
to look at it monolithically, to lump everything
together and consider all the problems at once," notes
Dr. Michael Evangelist, head of Carnegie Mellon's West
Coast campus. "NASA is looking at ways to
modularize design so you can focus individually on
important things." The recent destruction of the space
shuttle Columbia has revived interest in NASA's
computer systems and software.
"Proving IT" CIO Insight (01/03)
Vol. 1, No. 22, P. 40; Duvall, Mel
- Prior to committing to big IT investments,
companies are now setting up test labs to prove the
business value of tech projects. The in-house testing
facilities are usually overseen by CIOs and other IT
executives, while personnel are made up of business
people and technologists. Furthermore, solid business
results must be demonstrated within 90 days.
Stevens Institute of Technology professor Jerry
Luftman notes that IT labs must follow several
guidelines, including instituting management by both
IT and business; keeping projects at 90- to 120-day
timeframes; and being careful not to take on too many
projects at once. Information Economics President
Sunil Subbakrishna adds that scalability issues should
be considered for every project, and technology must
be properly engineered. Teamwork is also a requirement
for most IT labs, where project success often hinges
on the alignment of IT and business leaders with
business goals. Bell Canada's Centre for Information
Technology Excellence (exCITE!) lab evaluates proposed
projects using a team of business and technology
managers, and then develops them under strict
regulations. Bell has tested and approved many IT
initiatives through exCITE! that have added up to over
$25 million in savings and $10 million in additional
revenue. Twenty-five percent of computer and
communications manufacturers and 21 percent of
government IT executives polled by CIO Insight report
that they are beginning to require more pilot
projects. http://www.cioinsight.com/article2/0,3959,841192,00.asp
From ACM
News, February 7, 2003
"New Chapter in Success Story" Financial
Times (02/05/03) P. 6; Merchant, Khozem
- India's IT market is soaring due to weak markets
in mature economies, such as in the United States,
which is driving software programming and mundane
business processing overseas. Indian firms deliver
quality code, such as at Infosys, where programmers
average less than two errors per 1,000 lines of code.
Meanwhile, last year's fears about an imminent war
between India and Pakistan have dissipated,
re-igniting growth in the Indian IT market. Infosys
CEO Nandan Nilekani says, "Every twist and turn
overseas in recent years...has been good for Indian
IT." But although revenues and employment rolls are
growing rapidly, Nasscom President Kiran Karnik says
companies need to differentiate themselves based on
other aspects besides cost. Major IT firms in
India--Wipro, Tata Consultancy Services, and
Infosys--are ramping up their services arms in an
effort to offer broader solutions, even as global IT
service providers such as Accenture and Cap Gemini
Ernst & Young build up their own Indian
components. Bombay-based software interest Mastek has
taken a different route, partnering with Deloitte
Consulting in a joint venture. Other concerns worrying
Indian firms include increased scrutiny of overseas IT
outsourcing in the United States, as well as the rise
of eastern Europe as a low-cost competitor. Tata
Consultancy Services' S. Ramadorai notes that legal
concerns about data protection have arisen in the
United States after Sept. 11, 2001.
"Transforming IT" Optimize (01/03)
No. 15, P. 20; Allen, Bruce
- IT is essential to business productivity, yet many
corporate IT departments have not properly deployed
the processes and metrics needed to optimize their IT
efforts. To change this, some enterprising CIOs are
following a three-year, seven-step transformation
model with an iterative strategy. The first step is to
focus on products and pricing in order to build a list
of key services representing IT's value proposition;
the second step of process refinement involves
recognizing the processes to be included in a catalog,
thus mapping out all operational factors; the third
step is the creation of centers of excellence (COEs),
which initially revolves around identifying processes
that have the strongest relationship and need the
tightest integration. The fourth phase, metrics
requirements, will be driven by COEs, based on their
needs and those of process, product, and service
fulfillment. Three types of metrics--business results,
human capital, and unit cost--and four types of
performance results--financial, maturity, performance,
and project--must be measured. Rapid assimilation, the
fifth step, will allow the IT department to deal with
unexpected projects and workloads and minimize
operational disruption by deploying a formal
structure, while the sixth step, organization, relies
on the identification of COEs, products and services,
and processes. The seventh and final step is the
development of a game plan, which should yield a clear
idea of the requirements for IT transformation as well
as a solid foundation for future initiatives.
Continuous improvement should be implemented across
all levels of the transformation model, while CIOs can
align their transformational strategies with their
human capital through the establishment of
human-capital management centers of excellence. http://www.optimizemag.com/issue/015/management.htm
From ACM
News, February 5, 2003
"What Python Can Do for the
Enterprise" NewsFactor Network (02/03/03);
Brockmeier, Joe
- The open-source, object-oriented programming
language Python is ideal for companies that need
flexible code for use on a variety of platforms, but
do not have a lot of programming resources. Python was
created in 1991, the same year as Linux, and named
after Monty Python's Flying Circus, says creator Guido
van Rossum. He says Python is supported by nearly
every computing platform, even the Palm operating
system, and its applications port very easily from one
platform to another. Python also allows programmers to
finish their work faster, with short edit-and-test
cycles and little code, making them more productive.
Van Rossum points out, however, that Python is not
suitable for performance-critical duties, such as
device drivers and operating systems, but is ideal for
rapid development projects, or for creating sparsely
coded programs when mixed with other languages.
Jython, an implementation of Python that runs on the
Java virtual machine, lets Java applications be
developed quickly, for example.
Other famous examples of Python code are the Apache
Toolbox and the Oak DNS server. While fast development
and simple coding are a boon to programmers, those
aspects are even more important to people who are not
full-time programmers. ActiveState senior developer
David Ascher says scientists favor Python because it
reduces their dependency on professional programmers
and gives them more control over their own projects.
http://www.newsfactor.com/perl/story/20645.html
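The article's claims about short edit-and-test cycles and "little code" can be illustrated with a minimal sketch; the function name `word_counts` and the sample text are my own, not from the article.

```python
# A small, self-contained task -- counting word frequencies -- that
# Python's high-level standard-library types make terse compared with
# lower-level languages, the kind of brevity the article describes.
from collections import Counter

def word_counts(text):
    """Return a Counter mapping each lowercased word to its frequency."""
    return Counter(text.lower().split())

counts = word_counts("the quick fox and the lazy dog and the cat")
print(counts["the"])  # prints 3
```

A three-line function like this needs no compilation step, which is what keeps the edit-and-test cycle short.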
"Can't We All Just Get Along?" IEEE
Spectrum (01/03); Cass, Stephen
- Software companies feeling the pinch from the
collapse of the dot-com bubble could achieve
significant growth from businesses seeking to squeeze
the most efficiency out of existing enterprise
software; to aid them is an industry-wide push to make
software more interoperable. Incompatibility has long
been a staple of the software industry, given rival
vendors' penchant to enhance basic products with
special features for competitive advantage, to the
point that they can no longer interoperate. However,
by the late 1990s, many companies realized that
homogenizing systems to one platform was impractical,
and the advent of XML was a great leap forward to
compatibility. In addition to being platform
independent, XML schemas supply some minimal certified
information about the data they describe, and a
database of XML schemas is available for developers to
consult. The latest effort is to establish
interoperability between software applications that
use other applications, and this has resulted in plans
to develop Web services. Leading software makers are
moving to set interoperability standards by
collaborating in consortia such as the Web Services
Interoperability Organization (WS-I). However, Scott
Valcourt of the University of New Hampshire's
InterOperability Lab cautions that conforming to a
standard does not guarantee compatibility without
detailed review of the standard's written
specification; otherwise, ambiguities that go
unnoticed may cause different vendors to deploy the
standard in different ways. The formal standards
process has also drawn fire for its requirement that a
consensus be reached between rival vendors with a long
history of distrust. http://www.spectrum.ieee.org/WEBONLY/publicfeature/jan03/soft.html
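The platform independence the article credits XML with can be sketched briefly: any XML-aware consumer, in any language, reads the same fields from the same document. The tiny "order" document below is an invented example, not taken from the article.

```python
# Sketch: platform-neutral data exchange via XML. One system emits a
# small document; another parses it using only the standard library,
# with no knowledge of the sender's platform or vendor.
import xml.etree.ElementTree as ET

doc = "<order><id>42</id><status>shipped</status></order>"
root = ET.fromstring(doc)
print(root.find("status").text)  # prints "shipped"
```

The interoperability caveat in the article still applies: both sides must agree on the schema, or the same well-formed document will be interpreted differently.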
From ACM
News, November 25, 2002
"Retooling the
Programmers" InformationWeek (11/18/02) No.
915; Ricadela, Aaron
- Aspect-oriented programming seeks to relieve
companies of many headaches, such as the intense
difficulty programmers face in converting the needs
and ideas of non-technical personnel into usable code,
as well as organizing and updating vast numbers of
scattered code fragments dedicated to
computing-critical policies. The result of over 10
years of research by IBM, the Palo Alto Research
Center (PARC), Northeastern University, and the
University of Twente, aspect-oriented programming aims
to reduce the complexity and size of software programs
by sharing and reusing more code across their
components. The proliferation of the methodology
throughout the mass market could lead to better
software quality and reduced IT and maintenance costs,
since it would automate more of the discourse between
developers and businesspeople. "Aspect-oriented tools
force you to think at a higher and more concrete
level," declares JPM Design's Juri Memmert. "If you
don't do that, then you're basically hosed, no matter
what you're using." PARC issued an update this month
to version 1 of its Java-based AspectJ software, the
development of which was financed by the Defense
Advanced Research Projects Agency (DARPA). About 12
organizations, including the U.S. Air Force, Siemens,
and Sirius Software, use the application in a
commercial capacity. Other aspect-oriented programming
tools currently available or under development include
IBM Research's HyperJ and Cosmos. http://www.informationweek.com/story/IWK20021114S0020
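The core idea the article describes, factoring a cross-cutting concern out of scattered code fragments and weaving it back in, can be loosely illustrated with a Python decorator. This is only an analogy, not AspectJ or any of the tools named above, and the function names are invented for the example.

```python
# A loose analogy to aspect-oriented programming: the logging "aspect"
# is written once, in one place, and applied to any function, instead
# of scattering log calls throughout the code base.
import functools

def logged(func):
    """Weave a logging concern around any function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__} with {args}")
        result = func(*args, **kwargs)
        print(f"{func.__name__} returned {result!r}")
        return result
    return wrapper

@logged
def transfer(amount, account):
    return f"moved {amount} to {account}"

transfer(100, "savings")
```

Real aspect-oriented tools such as AspectJ go further, matching join points across a whole program by pattern rather than requiring each function to be annotated.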
From ACM
News, November 1, 2002
"Federal Workers Closing IT Skills
Gap" InternetNews.com (10/29/02)
- The skills gap between government and private
sector IT workers is narrowing, suggests a study by
Brainbench, a provider of online tests. The study
looked at the scores of 4,110 federal and 7,096
private sector employees who had taken Brainbench's
online tests. Federal workers generally outperformed
their private sector counterparts in the fields of
Unix, Linux, and Microsoft technologies, while private
sector IT workers fared better in such areas as
networking, databases, and Internet technologies.
Brainbench head Mike Russiello says the improvement
shown by governmental IT workers, particularly in
Unix/Linux, can be attributed partly to efforts by the
Chief Information Officer's Council and the National
Academy of Public Administration. He also favors the
passage of the proposed Davis Digital Tech Corps Act
of 2002, which involves the exchange of midrange IT
workers between agencies and private firms. The
exchanges would last from six to 24 months, and
participants would keep their normal salary and benefits for
the duration of the swap. http://dc.internet.com/news/article.php/1490361
From ACM
News, August 14, 2002
"What Does the Future Hold for
COBOL?" ZDNet Australia (08/08/02); Mante,
Keith
- Despite the hype surrounding Java, XML, .NET, and
Websphere, enterprises are trying to find ways to
maximize their mission-critical COBOL applications to
get the most out of their IT investments. Gartner says
that as much as 70 percent of active business
applications worldwide are written in COBOL, a code
base growing by some 5 billion lines every year.
Moreover, programmers still need to maintain post-Y2K
COBOL code. Another important point is that COBOL
applications are starting to become linked with
Internet-based applications. Rather than rewriting
applications, which is often impractical and not
commercially viable, enterprises can port the
applications to other platforms or combine the legacy
applications with Web services technologies. COBOL
programmers will ideally have such skills as XML and
Java to bring together COBOL developments, e-business,
and Web-based applications. http://uk.news.yahoo.com/020808/152/d6xfm.html