This judgment text has undergone conversion so that it is mobile and web-friendly. This may have created formatting or alignment issues. Please refer to the PDF copy for a print-friendly version.

IPTE Asia Pacific Pte Ltd v JMA Technologies Pte Ltd
[2005] SGHC 192

Case Number : Suit 1051/2003
Decision Date : 12 October 2005
Tribunal/Court : High Court
Coram : Kan Ting Chiu J
Counsel Name(s) : Tan Teng Muan (Mallal and Namazie) for the plaintiff; Troy Yeo Siew Chye (K K Yap and Partners) for the defendant
Parties : IPTE Asia Pacific Pte Ltd — JMA Technologies Pte Ltd
Contract  – Contractual terms  – Implied terms  – Plaintiff selling test systems to defendant under supply agreement  – Whether test systems defective and unfit for purpose of purchase

Contract  – Contractual terms  – Warranties  – Supply agreement containing warranties that test systems having no correlation with another system and fulfilling test time  – Whether plaintiff in breach of warranties

12 October 2005

Judgment reserved.

Kan Ting Chiu J:

1          This is essentially a claim for the price of goods sold and delivered. The plaintiff is a supplier of test systems for testing printed circuit board assemblies (“PCBAs”), and the defendant is one of its customers.

2          The parties entered into a supply agreement dated 10 December 2002[note: 1] (“the agreement”) for the plaintiff to sell to the defendant two sets of Rohde & Schwarz CMD55 test systems (“the R&S systems”) and test fixtures (“the fixtures”) manufactured by the plaintiff itself. The systems and fixtures were intended to be used together to test PCBAs for mobile phones that the defendant was producing.

3          The test systems were delivered to the defendant in December 2002 and were commissioned and accepted by the defendant on 26 June 2003. When the equipment was delivered, the R&S systems and the fixtures were regarded as integral parts of a test system.

4          The plaintiff’s claim was that the defendant had failed to make full payment for the equipment delivered to the defendant, and that the defendant had failed to place an order for another fixture which it had contracted to take up.

5          The defendant denied liability to pay the plaintiff on the grounds that:

(a)        the test systems supplied were defective and unfit for the purpose for which they were purchased;

(b)        the test systems supplied were in breach of the warranties of the supply agreement.[note: 2]

6          The Defence, which was amended three times, left much to be desired by way of clarity. I referred to the defendant’s closing submissions to get a better understanding of the Defence although the submissions themselves were not easy to read. Paragraph 12 of the submissions stated:[note: 3]

(1)        that there were indeed correlation issues between the Defendant’s Agilent tester and the [R&S] testers;

(2)        that the Defendant’s use of the cpk standard in determining whether there existed any correlation issue between the Agilent tester and the [R&S] testers, is based on industry standards;

(3)        that the test results of the Agilent and the [R&S] testers used by the Defendant to support its case are authentic;

(4)        that even as late as August 2003, the Plaintiff had failed to commission the testers that they sold to the Defendant;

(5)        that the Defendant had never at any time delayed commissioning by the Plaintiff and that the Defendant had been cooperative towards the Plaintiff throughout (pre and post Supply Agreement) in order to assist the Plaintiff in the commissioning of the [R&S] testers;

(6)        that the Com-port, formatting and battery calibration problems in the Defendant’s PCBs were not inherent product defects; rather these were run of the mill issues that manufacturers of product testers can expect to face in the normal course of events and therefore, it was the Plaintiff’s responsibility to overcome these problems when they calibrate the testers and/or that they did not delay or affect commissioning;

7          The defendant put up a counterclaim. It claimed that the ten-second slowness of the test process of the R&S systems led to a slowdown in production resulting in a loss to the defendant of US$2,820 a day, for a period of 11 months, totalling US$930,600. It also claimed that as a result of the unsatisfactory performance of the systems delivered by the plaintiff, it lost an order from Top Rank Cosmos Sdn Bhd worth US$4,104,700 and incurred a loss of profit of at least 10% or US$400,000. Lastly, it also claimed that it was entitled to a refund of US$48,692.33 it had paid to the plaintiff for the systems.

8          The plaintiff denied the counterclaim and pleaded that while it was prepared and entitled to attend to and perfect the operation of the systems under cl 4.3 of the agreement, the defendant did not allow or want that to be done.

9          I will deal with the “correlation issues” first (as pleaded, they were the correlation issue proper, and the test-time issue), as they arise from the express warranties in the agreement for the supply of the systems. The other issues relate to the terms on quality and fitness implied under s 14 of the Sale of Goods Act (Cap 393, 1999 Rev Ed) (“SGA”). I will refer to them as the implied terms issues.

The correlation issue

10        The warranties read:

Test System

-           Warranty no correlation issue with Aligent GS8000

-           Warranty to achieve 2mins 40sec test time for Nautilus GSM phone, with respect to IWOW test instructions.

(IWOW is iWOW Communications Pte Ltd, the design house which assisted the defendant in its production of the PCBAs and handphones).

11        The defendant complained that:

The acceptance tests that were conducted during the ‘attempted commissioning’ by the Plaintiffs revealed that the test systems (including equipment and fixtures) were below 100%. A test systems [sic] that is below 100% is unfit for the use of any production by the Defendants (manufacturers of hand-phones) …

and

To ascertain whether the Plaintiffs’ test system (including equipments and fixtures) complied with the terms and conditions in the said Supply Agreement, the Defendants, from reading the test data supplied by the Plaintiffs concluded that the said system could pass only 47 of the 60 steps (as test parameters). This differed from the test conducted by the Defendants on the Agilent GS8000 (presently operated by the Defendants). For the test conducted on the Agilent system, it passed 63 of the 63 steps (as test parameters). The results indicated that there was a correlation issue between the 2 systems and hence, in contravention of the terms and conditions of the said Supply Agreement.

and that the R&S systems did not fulfil the 2min 40sec test time, and could only achieve 2min 50sec.

12        The defendant had asked for a warranty that there was to be no correlation issue.[note: 4] The defendant had been using the Agilent GS8000 testers to test their PCBAs. When it considered buying the R&S systems recommended by the plaintiff, the defendant was anxious that the R&S systems would perform as well as the Agilent systems.

13        The correlation question had arisen in the course of negotiations. The defendant’s business development manager, Wee Hwa, sent an e-mail to Francis Cheong Chee Heng, a director of the plaintiff, on 30 August 2002 stating:

I want an assurance from IPTE [the plaintiff] that my production will not have any correlation issues between Agilent GS8000 and R&S equivalent tester. That’s mean all product units test under Agilent GS8000 and R&S Tester will showing the same result with accepted system tolerance. No variant in terms of test result of “pass” and “fail” shall be tolerance in R&S Tester with Agilent GS8000. Appreciate your comments on this. JMA will only proceed with R&S Tester if we feel comfortable with the above correlation issue being address. … [emphasis added]

and received a reply:

I would like to re-assure that you will not have any correlation issues between Aligent GS8000 and R&S tester. Both systems will provide you same “passed” and “failed” results, within accepted system tolerance. [emphasis added]

14        In the trial, Wee Hwa was not called as a witness by the defendant. Ong Chee Huck (“Ong”), the defendant’s chief technology officer, gave evidence on the defendant’s behalf with regard to the warranty.

15        Ong explained the defendant’s understanding of the correlation issue thus:

I will break it into two crucial point, two crucial points. Point number one, when I am testing my PCBA on the Agilent, if it is pass and I take the same PCBA, I test on the R&S tester, and it also pass, we will allow it acceptable system tolerance. We will allow it, an acceptable system tolerance.

In the event that when I test the PCBA on the Agilent and it pass, and this same PCBA, if I test it on the R&S tester and it failed, we do not allow any tolerance at all.

In basic terms, if there is a “pass” result for the Agilent system and the same result from the R&S system, ie, pass/pass, there is no correlation problem. Likewise, if there is a “fail” result from both systems, ie, fail/fail, there is no problem. But if the results are mixed, ie, pass/fail or fail/pass, that is a problem. I will call this the “pass/pass fail/fail” test.

16        This perception of correlation was a minefield of complications. There are serious doubts whether the defendant had considered the matter fully and had a clear understanding of it.

17        First, the Agilent GS8000 is a test system; it is not a standard. Its performance is subject to variations and deviations from absolute accuracy. When another system, such as the R&S systems, shows results that differ from the Agilent system’s, the R&S systems may be performing better than the Agilent system and giving more accurate results.

18        Second, the defendant was operating two Agilent systems. There was no evidence that they performed uniformly or that there were no correlation issues between them. If the performance of the Agilent system is to be the standard against which the R&S systems are measured, which of the two systems are the R&S systems to correlate to?

19        Third, the tests must be conducted properly. The Agilent and R&S systems’ performance should be measured against stable devices under test (“DUT”) to ensure that any divergences in the test results arise from the test systems and not from the instability of the DUTs. The defendant did not use independently-tested DUTs for the tests.

20        The fourth problem was that there was little evidence on the process by which the test results were obtained. In its Further and Better Particulars, the defendant had stated that the Agilent test results were obtained in “June and July 2002” by the defendant’s engineer, Ms Ching Lee Kia (“Ching”), in the course of normal operation. However, she was not called as a witness, and when Ong gave evidence, he said that the date was wrong, and it should have been in June 2003 when the R&S systems were commissioned and accepted by Ching on behalf of the defendant. It is obvious that the year 2002 was wrong because that predated the delivery. However, the shortening from “June and July” to “June” was not explained. The Agilent and the R&S systems’ test results were collated by Ching into a table.[note: 5] This table set out the test results of one Agilent system and the R&S systems from five DUTs. There was no confirmation that all the available test results were collated. To the contrary, in the course of the cross-examination of Ong, there was evidence that there was some selection amongst the results.

21        There is a fifth problem. The strict “pass/pass fail/fail” test propounded by the defendant is not rational. Both parties agree that systems do not work with absolute accuracy, and that there are accepted system tolerances. As stated earlier, all test systems operate within accepted degrees of tolerance. If the verified standard is Xº and the accepted tolerance for the Agilent and the R&S systems is 5º, then a test reading of between X–5º to X+5º for the standard is considered as a “pass” and anything outside of the range is a “fail”. If the Agilent system gives a result of X+5º and the R&S system gives a result of X–5º, both are “pass” results, there is no correlation issue although there is a 10º difference between the two test results. But if the Agilent result is a “pass” at X+5º, and the R&S result is X+6º, and is a “fail”, ie, a “pass/fail” result, there is a correlation issue although the difference between the two results is only 1º. A criterion that accepts a 10º deviation and rejects a 1º deviation is plainly flawed. The correlation criteria must take in accepted system tolerances to make sense. The “pass/pass fail/fail” test must be applied subject to accepted system tolerances.
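The flaw identified in [21] can be made concrete with a short sketch. The figures below (a verified standard of 100 units and an accepted tolerance of 5 units) are illustrative stand-ins for the Xº and 5º used above; they are not drawn from the evidence.

```python
# Illustration of the flaw in a strict "pass/pass fail/fail" correlation test.
# X and TOL are hypothetical figures standing in for the judgment's example.
X = 100.0    # verified standard (illustrative value)
TOL = 5.0    # accepted system tolerance for both testers

def verdict(reading):
    """A reading within X +/- TOL is a 'pass'; anything outside is a 'fail'."""
    return "pass" if X - TOL <= reading <= X + TOL else "fail"

def correlated(agilent_reading, rs_reading):
    """The strict test: the systems 'correlate' only if both verdicts agree."""
    return verdict(agilent_reading) == verdict(rs_reading)

# Case 1: readings 10 units apart, yet both 'pass' -> no correlation issue found.
print(correlated(X + 5, X - 5))    # True, despite a 10-unit spread

# Case 2: readings only 1 unit apart, but one side 'fails' -> correlation issue.
print(correlated(X + 5, X + 6))    # False, despite a 1-unit spread
```

The sketch reproduces the asymmetry the paragraph identifies: a criterion that tolerates a 10-unit divergence while condemning a 1-unit one.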

22        The defendant called Dr Tan Guan Hong (“Dr Tan”), an electrical engineer of 25 years’ standing in the manufacturing and test system industry, to support its case. Dr Tan was consulted at a late stage of the dealings between the parties, and did not examine or work with the PCBAs or the Agilent or the R&S systems. He rendered his report on the basis of two pages of compiled Agilent and R&S test results the defendant had supplied to him, without being shown the original full test results from which the data in the two sheets were extracted. He did not know if the Agilent test results came from one or both of the defendant’s Agilent test systems.[note: 6] When the plaintiff’s counsel showed him that some unfavourable original source data were left out of the compilation, he accepted that if those data were included they would affect the analysis of the test results.[note: 7]

23        His report stood out for its brevity. It read:

Subject: Data Analysis for co-relation of test results

Based upon the data which was given by you, the analysis of the Agilient and IPTE [R&S] testers are shown in EXCEL format with the critical parameters highlighted.

The test results for some parameters for the IPTE [R&S] exceeds the test limits and hence the Cpk are negative. As the test results show negative data, the testers passed the Agilient tester but fail on the IPTE tester.

The summarized report is attached together with the EXCEL printout [the Agilent and R&S test results].

24        It was also noteworthy for its non-disclosure of (a) the exact test parameters and test limits adopted which brought about the negative Cpk (Process Capability index) results, and (b) what the negative results were. Augustine Yap (“Yap”), the defendant’s engineering manager, had mentioned a 1.3 Cpk standard to the plaintiff,[note: 8] but Dr Tan exhibited in his report an extract from Quality Control (Prentice Hall, 3rd Ed, 1990) by Dale H Besterfield[note: 9] that:

A Cpk value of 1.00 is a de facto standard. It indicates that the process is producing product that conforms to specifications.

25        Nevertheless, the defendant’s counsel submitted that a Cpk value of 1.3 is the ideal and that it is “something so fundamental to engineering that there is hardly any need to insert it into the contract in the present case.”[note: 10] Was he alluding to an implied term whose application did not need to be pleaded? This submission succeeded at being hopeful and hopeless at the same time.
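For context, the Process Capability index referred to in [24] and [25] is conventionally computed as Cpk = min(USL − μ, μ − LSL) / 3σ, where USL and LSL are the upper and lower specification limits and μ and σ are the process mean and standard deviation. On that conventional definition, a negative Cpk, as reported by Dr Tan, indicates that the process mean lies outside the specification limits altogether. A minimal sketch, with illustrative figures not drawn from the evidence:

```python
import statistics

def cpk(readings, lsl, usl):
    """Process Capability index: the distance of the process mean from the
    nearer specification limit, in units of three standard deviations.
    A negative Cpk means the mean lies outside the spec limits."""
    mu = statistics.mean(readings)
    sigma = statistics.stdev(readings)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Illustrative readings only (not from the evidence):
in_spec = [9.9, 10.0, 10.1, 10.0, 9.95]           # clustered inside the limits
print(cpk(in_spec, lsl=9.0, usl=11.0) > 1.0)      # True: above the 1.00 standard

out_of_spec = [11.5, 11.6, 11.4, 11.55, 11.45]    # mean beyond the upper limit
print(cpk(out_of_spec, lsl=9.0, usl=11.0) < 0)    # True: negative Cpk
```

The sketch also shows why the exact test limits matter: without knowing the limits and parameters Dr Tan adopted, a reported negative Cpk cannot be checked.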

26        When Dr Tan was cross-examined by the counsel for the plaintiff he revealed that he was informed by the defendant that the DUTs used were not tested beforehand. Instead, he was told:[note: 11]

[S]ome PC boards are not functioning, some PC boards are not stable. So they get rid of all of that, so that you must have a stable PCBA, then at least you know it is stable. You measure this one, get one result, measure these two, measure this one. At least you get a bit more stable results. That is what I was told.

Dr Tan did not endorse this method of selection as an acceptable basis for establishing the stability of the DUTs used. Clearly, DUTs must be independently tested and found to be stable before they can be used to test the systems.

27        He agreed that if independently-tested DUTs were used, that would eliminate product uncertainty, ie, the instability of the DUTs. He added that when product uncertainty is eliminated, differences between the test results of the different test systems will reflect tester performance.[note: 12] That implied that when untested DUTs were used, differences in the test results might arise from (a) product uncertainties in the DUTs, or (b) performance differences of the test systems, and not necessarily the latter. This shortcoming threw serious doubts on the reliability of the test results as a measure of test system performance.

28        After reviewing Dr Tan’s evidence, I find that the circumstances of his engagement left much to be desired. He should have been told of the ongoing problems, and his advice should have been sought on how proper conclusions were to be obtained, ie, whether tested DUTs were to be used, whether the Agilent test systems should be tested for stability of performance, and whether every test figure should be included in the collation of the test results.

29        The shortcomings that had accumulated by the time the two sheets of collated test results were submitted to Dr Tan went to the root of the value and reliability of his opinion.

The test time issue

30        The defendant’s case on this issue fell apart when Ong was cross-examined, and he conceded:[note: 13]

Q.        Subsequently, are you aware that the test time was enlarged? Do you have any personal knowledge of this?

A.        Yes, I was … I had been briefed about this test time issue.

Q.        And that it was enlarged to more than 200 seconds, in fact about 240 seconds?

A.        Can you repeat the question[?]

Q.        You said you knew at the time, probably prior to contracting or thereabouts through discussions about this test time. My question to you is: do you have personal knowledge of events subsequent to that, particularly about the enlargement of test time to 240 seconds?

A.        Yes. Yes.

and when he was shown Dr Tan’s spread sheet:[note: 14]

Q.        You will see that the limits had been changed, and this is reflected in the test records for IPTE 1 and IPTE 2 [the two R&S systems], you see that the high limit is 240 seconds under IPTE 1, and similarly 240 seconds for IPTE 2; correct?

A.        Yes.

Q.        And that this was the revised test cycle time?

A.        Yes.[note: 15]

31        The defendant did not pursue this issue further in its closing submissions.

The implied terms issue

32        There was a pervading vagueness in the presentation of the defendant’s case on these complaints. On the evidence, the officers of the defendant most involved in the commissioning and testing of the R&S systems and in liaising with the plaintiff were Yap and Ching. Neither of them gave evidence of the matters that arose between the plaintiff and the defendant. This was left to Ong, who had little or no direct involvement in or knowledge of those matters. Ong sought to rely on Dr Tan’s report without resolving the problems and shortcomings which I have referred to.

33        Thus, when Ong referred to Cpk, he stated that:[note: 16]

[O]n 16 and 30 June 2003 … the Defendants again emphasized the importance of correlation between the 2 systems, including with Cpk expectation. In manufacturing practice, Cpk is an objective evaluation method for correlation and this correlation is to be within Cpk expectation.

without identifying the Cpk expectation he referred to. While it was common ground that Cpk can be used to determine whether the testers were performing to expectation, there should be an agreed Cpk value. In the absence of that, it is as meaningful to say that the testers performed below Cpk expectation as it is to say that the test time is exceeded in the absence of any agreed test time.

34        The plaintiff, on the other hand, adduced evidence from persons who had first-hand knowledge of the matters. Eduard Verhoeven (“Verhoeven”), the managing director of the plaintiff, gave evidence that he was directly involved in the plaintiff’s negotiations with the defendant on the supply of the test systems, as well as the dealings between the parties after the systems were delivered.

35        He deposed that the defendant had not complained of communications error, or that four out of five PCBAs tested failed to complete the test cycle because of communications error. Even when he met the defendant’s representatives on 8 August 2003, communications error was not raised,[note: 17] and his evidence was not challenged on this point.

36        Denis Berthold (“Berthold”), an engineer assigned by the plaintiff to attend to the preparation and commissioning of the R&S systems, also gave evidence. He became involved in October 2002, before the supply agreement was signed. He liaised with the representatives of the defendant and iWOW on the test program that was required, and discovered that they were unable to give him the necessary information and instructions. Eventually, he worked on his own and produced a 59-step test and calibration program.

37        When the systems were delivered in December 2002, they were run with the 59-step test program, and the results were presented to the defendant. There were no complaints about the test program.

38        At about the same time, a problem was encountered with the PCBAs: they were unable to format, ie, store and retrieve information downloaded into them. Berthold spent time from April to June 2003 working to overcome the problem, although this work was outside the plaintiff’s scope of work and services.[note: 18]

39        After the commissioning, the defendant, through Yap, brought up the question of statistical expectations and a Cpk of 1.3. This was the first time that statistical requirements for the results were raised. Berthold considered the fixing of Cpk at 1.3 to be arbitrary as no statistical criterion had been set for the commissioning.

40        Berthold also deposed in his Affidavit of Evidence-in-Chief to complaints of excessive Phase Peak Error and Phase RMS Error. The complaints were based on results alleged to have been obtained from the R&S systems. However, Berthold disputed the accuracy and completeness of the results and asserted that when he carried out his own tests in his own office, the errors were within acceptable limits.

41        He concluded that the differences might be due to the fixed probes and RF antenna connection of the test fixtures. In August 2003, he proposed the addition of insulation to the antenna which improved its performance. He had also recommended that the fixed probes be replaced by floating probes, but the defendant did not return the fixtures for the modification to be made. Instead, the defendant responded through its solicitors to allege that the systems did not comply with the warranties and were unfit for the purpose they were bought, and demanded that the plaintiff take back the systems and refund the payments it had made.

42        Berthold was cross-examined at some length by Mr Yeo, counsel for the defendant, on the commissioning of June 2003 and the subsequent actions taken in attending to the defendant’s complaints.

43        That was difficult and unrewarding work for Mr Yeo because those complaints were based on test results from the R&S systems. The difficulty Mr Yeo faced and acknowledged was that the plaintiff did not accept those test results, and the defendant did not call the persons who ran the tests and obtained the results to show that the tests were done and the results were obtained properly.

44        On a review of the evidence, it can be seen that the commissioning of the test systems was a process which commenced in June 2003 and carried on in the months that followed. There were some improvements made in respect of some complaints, and some disagreements over the validity of other complaints. In respect of problems that the plaintiff recognised, such as the antennae and probe problems, the plaintiff had attended to them and offered solutions, but its efforts were eventually rebuffed.

45        The plaintiff’s case was that cl 4.3 of the supply agreement entitled it to attend to the problems after delivery. Clause 4.3 provides that:[note: 19]

If the Customer rejects any delivery of the Goods which are not in accordance with the Specification during the Warranty Period, the Supplier shall at its sole discretion repair or supply replacement Goods or refund the price of such defective Goods if it has been paid or if it has not been paid, to relieve the Customer of the obligation to do so.

and the Warranty Period is the period of one year from the date of delivery.

46        The defendant’s response was that:[note: 20]

Clause 4.3 of the supply Agreement is only applicable if the Plaintiffs could show that they were able to rectify the ‘defects’ in accordance of [sic] the specifications. There is nothing to show that the Plaintiffs were able to do so and/or expeditiously.

47        However, this was contradicted by the evidence that the plaintiff was taking steps to get the systems to function properly, was achieving some results, and was prepared to continue to attend to the glitches.

48        The complaints of correlation problems and failure to satisfy Cpk standards were not substantiated by any reliable evidence. Berthold was the only direct source of evidence on the problems, investigations, solutions and recommendations, and his evidence contradicted the defendant’s case.

49        The defendant had not shown to my satisfaction that there were correlation issues, or that the problems with the test systems could not be resolved in time if further attention was given.

The counterclaim

50        The defendant did not take its counterclaim seriously. It did not produce anything to support the counterclaim. The entirety of its evidence on the counterclaim was Ong’s assertion that:[note: 21]

The failure to commission and hand over expeditiously clearly vindicates the position that the Tester (Test System and Test Fixtures) was not merchantable and/or unfit for the purpose contracted for. This failure affected the Defendant’s business and contractual plans including their contract with Top Rank Cosmos Sdn Bhd (TR).

There was no evidence on how the pleaded loss arose, and no attempt to explain or justify the sums pleaded in the counterclaim.

Conclusion

51        Judgment shall be entered for US$310,385.01, being the unpaid price of the test systems and fixtures delivered with contractual interest at 1% per month from due date to the date of judgment, as well as damages, to be assessed, for the defendant’s failure to take up an additional test fixture. The counterclaim is dismissed.
 

52        The plaintiff shall have the costs of the action, to be taxed on the standard basis.



[note: 1]The date of the agreement is sometimes stated as 11 December 2002, but there is no dispute over the existence of the agreement.

[note: 2]Re-re-amended defence and counterclaim para 7(a)

[note: 3]Submissions of the Defendants para 12

[note: 4]Affidavit of Evidence-in-Chief of Ong Chee Huck para 10

[note: 5]DB236-237

[note: 6]Notes of Evidence page 176

[note: 7]Notes of Evidence pages 170-172

[note: 8]see PB183

[note: 9]Affidavit of Evidence-in-Chief of Dr Tan Guan Hong page 29

[note: 10]Submissions of the Defendants para 49

[note: 11]Notes of Evidence page 162

[note: 12]Notes of Evidence pages 154-156

[note: 13]Notes of Evidence page 96

[note: 14]DB235

[note: 15]Notes of Evidence page 97

[note: 16]Affidavit of Evidence-in-Chief of Ong Chee Huck para 18

[note: 17]Affidavit of Evidence-in-Chief of Eduard Verhoeven para 22

[note: 18]Affidavit of Evidence-in-Chief of Denis Berthold paras 37-39

[note: 20]Opening Statement of Counsel for the Defendants para 21

[note: 21]Affidavit of Evidence-in-Chief of Ong Chee Huck para 42

Copyright © Government of Singapore.



Version No 0: 12 Oct 2005 (00:00 hrs)