Host-pathogen interaction between spring wheat and Septoria nodorum with reference to resistance breeding

Host-pathogen interactions between spring wheat and Septoria nodorum Berk., with applications for wheat breeding, were studied. The ultrastructure of the interactions was studied using electron microscopic techniques. Following inoculation, conidia of S. nodorum germinate, form appressoria and a penetration peg which penetrates directly through the cell walls. It is suggested that most penetration attempts fail because of cellular defence reactions, the formation of papillae and cell wall alterations. Inoculation with a low spore concentration reduced the grain yield of Hankkija's Taava cultivar by 10 % and 1000-grain weight by 14 %. Inoculation with a high spore concentration on large plots of Tähti cultivar reduced grain yield by 32 % and 1000-grain weight by 18 %. Inoculation with a high spore concentration on normal breeding plots reduced the grain yield of Tähti cultivar by 35 % and 1000-grain weight by 21 %, and the grain yield of Kadett cultivar by 27 % and 1000-grain weight by 20 %. Inheritance studies on F2 progenies of spring wheat crosses involving susceptible and moderately or highly resistant parents suggest that the heritable component of symptom expression is moderate and that breeding success depends mainly on efficient screening techniques. Resistance was associated with tallness in crosses, and cultivar trials suggest that resistance is positively associated with late maturation. Field screening techniques based on small plots and artificial inoculation showed that the most resistant entries were wild Triticum species and late and tall cultivars. Seedling plant tests based on attached seedling leaves and detached leaves easily revealed the most resistant and most susceptible cultivars. The overall correlation between seedling tests and field tests was quite high. The results are discussed in relation to wheat breeding strategies for resistance to S. nodorum.


Preface
The present investigations were carried out at the Departments of Plant Pathology and Plant Breeding, University of Helsinki. Part of the first field trials was carried out at the Hankkija Plant Breeding Institute, Hyrylä. Electron microscopic work was performed at the Department of Electron Microscopy, University of Helsinki.
I wish to express my deep gratitude to Professors Eeva Tapio and Peter Tigerstedt for their support and encouragement during the progress of this work.
I am grateful to Professors Eeva Tapio, Erkki Kivi, and Peter Tigerstedt, and Docent Kari Lounatmaa for constructive criticism and comments on the manuscript.
I am pleased to extend my best thanks to Drs Peter R. Scott …

Introduction
Current emphasis in agricultural research is to devote more effort to finding methods that reduce production costs and the risk of environmental pollution. Genetic plant protection is cheaper for the grower than other forms of protection (Pesola 1930, Riley 1979), and it offers an ideal method for controlling plant diseases biologically (Hagedorn 1983). Plant breeding, through the development of resistant cultivars, has made a remarkable contribution to controlling many important pests and diseases (Hagberg and Gustafsson 1981, Lupton 1984). For example, Hagedorn (1983) states that over ten years ago it was estimated that more than 75 % of the crop acreage in the United States was being planted with disease-resistant cultivars, with an annual value to growers of over a billion dollars. In addition, Doodson (1981) showed the significant economic benefits of growing winter wheat cultivars resistant to leaf diseases over a ten-year period in England and Wales.
The roots of resistance breeding date back to the history of human cultural evolution, but the rediscovery of Mendel's work was the turning point for scientific resistance breeding. Biffen (1905) found that the resistance of wheat progenies to yellow rust followed Mendel's laws, and he suggested that cultivar resistance could be improved through selection. Scientific progress in resistance breeding over the past 80 years has not been a steady process but one where great momentary advances and long steady periods alternate, a phenomenon well known in many areas of science (Kuhn 1962).
The first era of resistance breeding was based almost totally on easily detectable types of resistance conferred by a few major genes. However, it soon appeared that resistance breeding was not always successful. The explanation for this was provided by Barrus in 1911 (ref. Ellingboe 1981) who reported that two isolates of Colletotrichum lindemuthianum possessed differential pathogenicity on bean cultivars. This was the beginning of the era of physiological specialization studies, and many methods developed for race surveys at that time are still used today.
The next peak in the ideology of resistance breeding came in 1946-1947, when Flor introduced his gene-for-gene theory. This was a turning point in the history of plant pathology, and the theory is still one of the most important discoveries in this field (Ellingboe 1981, Vanderplank 1982). Subsequent development was based on the analysis of several host-pathogen systems, and in many cases it proved difficult to detect a gene-for-gene pattern. Vanderplank (1963, 1968) made a substantial contribution to the development of new ideas and led the way to more theory-orientated analyses of host-pathogen interactions. He introduced the terms vertical and horizontal resistance, the former being race-specific and liable to breakdown by new virulent strains, the latter race-non-specific, stable, and controlled by many minor genes. However, later studies (e.g. Nelson 1978, Parlevliet 1979, Ellingboe 1981) have challenged many of Vanderplank's hypotheses and shown that in many cases the definitions vertical and horizontal are far too simple to explain the genetic basis of the durability of cultivar resistance.
In the 1970s, resistance research advanced in two main areas. The application of population genetics to epidemiology gave new ideas about the dynamics of pathogen populations, for example about virulence genes and the selection pressures caused by resistant cultivars (Person et al. 1976, Leonard 1977, Wolfe and Barrett 1977, Zadoks and Schein 1979). Over the last ten years great efforts have been made to reveal the biochemical and molecular basis of resistance and to shed light on the crucial problems of recognition in host-pathogen interactions (Friend and Threlfall 1976, Daly and Uritani 1979).
Currently, some of the greatest expectations are focused on the applications of recombinant DNA techniques to resistance breeding (Ellingboe 1981, Foard et al. 1983, Comai and Stalker 1984). Genetic engineering can substantially speed up selection (Day 1984 b). For example, cDNA probes can now be routinely used in virus resistance screening (Flavell et al. 1983, Baulcombe et al. 1984). Further, a better understanding of pathogenicity and virulence can give new tools for resistance breeding.
Another area of much interest is the identification of the primary gene products of single resistance genes (Foard et al. 1983, Kuhn et al. 1984), because purified DNA sequences are required for utilizing novel ways of introducing genes into plants by genetic engineering. It is probable that within the next five years genes coding for some single-gene resistances could be transferred into agronomically accepted crop cultivars using vectors such as the Ti plasmid or transposons.

Research on cereal disease resistance in Finland has a long but discontinuous tradition. For example, Pesola (1927) made an extensive study on yellow rust resistance of spring wheat and Kivi (1956) on stem rust of spring wheat. Since those days breeding efforts have been concentrated on improving the level of powdery mildew resistance (Nissinen 1973). Furthermore, breeding winter cereals for resistance to fungi causing winter damage has long been an important research aim (e.g. Pohjanheimo 1962, Jamalainen 1969).

2. Literature review on current ideas of host-pathogen interactions

2.1. Genetic aspects of host-pathogen interactions

Resistance to many plant diseases is monogenically inherited (Russell 1978, Leonard 1984). The difference between resistance and susceptibility in segregating populations is clear-cut, and if resistance is dominant, it is easy to use in plant breeding (Russell 1978). However, host reaction to the disease can be more complicated. There is evidence of host resistance controlled by a few oligogenes and even polygenes (Day 1974), and if each of these genes has a small effect, the expression of resistance is continuous in segregating populations and its detection is difficult in plant breeding (Russell 1978).
When studying the inheritance of resistance in flax and the virulence of its obligate parasite, Melampsora lini, Flor (1946, 1947) found that for each gene conditioning resistance in the host there was a specific matching gene conditioning virulence in the pathogen. Thus a resistant (incompatible) reaction occurs when the host has a dominant gene for resistance that is not matched by a corresponding gene for virulence in the pathogen (Table 1). This gene-for-gene hypothesis suggests (Person and Mayo 1974, Ellingboe 1982) that the allele for avirulence in the parasite and the corresponding allele for resistance in the host play a key role in determining that the reaction will be incompatible.
It has been suggested (Sidhu 1975) that the gene-for-gene pattern applies to a wide range of host-parasite interactions, and Ellingboe (1981, 1982) has stated that about 95 % of all analyzed disease resistance follows this pattern. However, Day et al. (1983) argue on several grounds that in a sense gene-for-gene systems are the tips of icebergs. Further, Johnson (1983, 1984) gives several examples of host-parasite interactions where no significant interaction between parasite genes and host genotype exists.
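As an illustration only (not part of the original study), the gene-for-gene rule summarized in Table 1 can be sketched as a small truth table: an interaction is incompatible (resistant) only when a dominant resistance allele in the host meets its matching avirulence allele in the pathogen, and every other combination is compatible (diseased).

```python
def reaction(host_has_R: bool, pathogen_has_avr: bool) -> str:
    """Phenotype for one host gene / pathogen gene pair under the
    gene-for-gene rule: incompatibility requires both the dominant
    resistance allele and the matching avirulence allele."""
    if host_has_R and pathogen_has_avr:
        return "incompatible"  # resistance expressed, pathogen stopped
    return "compatible"        # disease develops

# The four combinations of the classical quadratic check:
for R in (True, False):
    for avr in (True, False):
        print(f"R={R!s:<5} avr={avr!s:<5} -> {reaction(R, avr)}")
```

Only the cell where both alleles meet is incompatible, which is why a new virulent race (loss of the avirulence allele) restores compatibility against that resistance gene.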

Determinants of pathogenicity and virulence
Plant disease is the outcome of interactions between the plant, the pathogen, and the environment (Kosuge et al. 1983). The precise mechanisms by which a pathogen causes disease in plants are poorly understood (Kado and Lurquin 1982, Daniels 1984). The disease response is related to pathogenicity, a qualitative term which defines the ability of an organism to cause disease on its host, and virulence, a quantitative term which defines the severity of the disease caused by the pathogen (Kosuge et al. 1983).
Classical genetics has shown (Day 1974) that there can be single, few, or many genes coding for pathogenicity or virulence in the pathogen. Hitherto, very little is known about the primary products of such genes, because the genetics of many pathogens is poorly known (Day 1984 a). However, it is known (Kado and Lurquin 1982) that the genes of the pathogen code for products which in turn directly or indirectly cause the disease response in the plant. Thus it appears that increasing attention should be paid to revealing the nature of the pathogen and to determining its inherent weaknesses (Kado and Lurquin 1982). It is apparent that the genes for virulence enable the pathogen to grow on and colonize the host, while the genes for pathogenicity code for products that alter cell function (Kado and Lurquin 1982).
Many pathogenicity and virulence determinants, such as toxins, cell-wall-degrading enzymes, and polysaccharides, have been demonstrated to be involved in the disease response (Cooper 1983), but their precise role in disease mechanisms is poorly understood.
The application of recombinant DNA techniques to pathogenicity studies (Comai and Kosuge 1982) has recently made it possible to achieve considerable progress in revealing the nature of pathogenicity and virulence.
Recently, some workers have been able to isolate and clone genes coding for pathogenicity and virulence in bacteria: Erwinia sp. (Keen et al. 1984), Pseudomonas sp. (Comai and Kosuge 1982, Staskawicz et al. 1984), and Xanthomonas sp. (Daniels et al. 1984). Furthermore, Yoder (1983) has been able to construct a gene vector amenable to pathogenic fungi for the cloning of genes coding for toxin production, and Soliday et al. (1984) were able to clone and sequence the gene coding for cutinase, an enzyme involved in the fungal penetration of plants. Thus, recombinant DNA techniques have already proved to be a powerful tool in pointing the way to a better understanding of the nature of plant pathogens. It is evident that a precise understanding of gene functions in virulence and pathogenicity will open new ideas for disease control.

Molecular aspects of interactions
Several models have been proposed to explain molecular mechanisms in gene-for-gene host-parasite systems (Day 1974, Albersheim and Anderson-Prouty 1975, Chakravorty and Shaw 1977, Ellingboe 1982), but no testable models have yet been published. It is assumed (Keen 1982, Callow 1984) that the high degree of specificity exhibited by gene-for-gene systems suggests highly selective host receptors capable of detecting specific features of parasite races. The hypothesis now widely accepted (Keen 1982) is based on the idea that a genetically determined early recognition phase precedes the invocation of the biochemical and histological events that stop the colonization of the pathogen. Keen (1982) has recently introduced the elicitor-receptor model to explain the molecular control of gene-for-gene interactions. The driving force in this model is the presence in plant cells of specific surface receptors, probably proteins or glycoproteins, that recognize the surface molecules of incompatible but not compatible pathogen races. Consequently, incompatible races initiate active defence, particularly phytoalexin accumulation, which inhibits the growth of the pathogen (Keen 1982). Evidence to support this model is at the moment limited. There are extensive data on elicitors isolated from fungal cell walls (West 1981, Keen et al. 1983, Darvill and Albersheim 1984) as well as from plant cell walls (Davis et al. 1984) which can induce phytoalexin accumulation. However, there is only one report of a receptor on the plant membrane for a fungal elicitor.
Recently, Bell et al. (1984) provided molecular evidence that the induction of mRNA activities encoding enzymes of phytoalexin biosynthesis is a key component in the regulation of phytoalexin accumulation in relation to hypersensitive resistance in an incompatible interaction. Further, they suggest that the induction of chalcone synthase mRNA activity at the early stages of incompatible interactions represents an early biochemical event in a causally related sequence leading from genetically specified recognition in an intact host-pathogen system to the operation of a well-characterized defence response. Applying recombinant DNA techniques to the Pseudomonas syringae pv. glycinea-soybean system, Staskawicz et al. (1984) were able to identify the gene responsible for race specificity, and they also provided evidence that incompatible races play an active role in defence in gene-for-gene systems.
However, recent progress has shed only limited light on the molecular basis of disease resistance. Day (1984 a) has suggested that progress in revealing the mechanisms of resistance and specificity in gene-for-gene systems has been slow principally because the mechanism of specificity determination is a great deal more complex than the genetic control implies. In addition, Daly (1984) states that the basic problem with current genetic and recognition models is that they are static, lacking the known plasticity of natural disease reactions. The present models of molecular control of resistance seem valid to a limited extent in only some gene-for-gene systems, and they probably throw very little light on host-pathogen systems controlled by more complex genetic systems.

Plants can reduce disease damage through escape, tolerance, and resistance (Parlevliet 1981). Escape mechanisms include various forms such as morphological traits and different timing of flowering (Agrios 1980), which operate before contact between host and pathogen is established.
Tolerance means that plants are diseased but suffer only little damage. Resistance to pathogens involves diverse physiological, histological, and biochemical mechanisms, which can be effective before physical contact between host and pathogen or be induced after parasitic attack (Russell 1978). Respiration studies (Smedegaard-Petersen and Stølen 1980, Uritani and Asahi 1980, Kosuge and Kimpel 1981) suggest that the expression of active defence mechanisms is an energy-requiring biosynthetic process, which is likely to deprive the host of energy. This suggestion has recently been confirmed by yield experiments (Smedegaard-Petersen 1982), which show a yield reduction in barley after inoculation with avirulent fungi, and by bioenergetic calculations made by Mitra and Bhatia (1982). Vanderplank (1963, 1968) divided resistance to plant pathogens into two types, horizontal and vertical. According to his definition, resistance is horizontal if its variation is independent of pathogen variation, and vertical if variation in the pathogen is qualitatively associated with variation in the host. The effect of different types of resistance on disease development according to Vanderplank (1968) is presented in Fig. 1. The magnitude of the effect of host resistance on pathogen development can range from very small to very large (Fry 1982). Monogenic resistance, of the hypersensitive type, often gives complete protection against specific races of the pathogen, while quantitative resistance (the terms horizontal and partial are also used) is characterized by a reduced rate of epidemic development resulting from factors that reduce infection efficiency, extend the latent period, and reduce sporulation (Umaerus 1970, Zadoks 1971, Parlevliet 1979, Kranz 1983). Thus it is obvious that the detection of quantitative resistance is difficult, mainly because it is easily masked by environmental factors and the effect of the growth stage of the plant (Parlevliet 1979).
In addition, it is difficult to accurately assess the practical value of quantitative resistance in crop production (Leonard and Mundt 1984).
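The contrast between vertical and horizontal resistance described above is often summarized with the logistic disease-progress model dx/dt = r·x(1−x), in which vertical resistance mainly lowers the initial inoculum x0 while horizontal resistance mainly lowers the apparent infection rate r. The following sketch is an illustration with invented parameter values, not data from this study:

```python
import math

def disease_progress(x0: float, r: float, t: float) -> float:
    """Closed-form solution of the logistic model dx/dt = r*x*(1-x):
    x(t) = 1 / (1 + ((1-x0)/x0) * exp(-r*t)),
    where x is the diseased fraction of host tissue."""
    return 1.0 / (1.0 + ((1.0 - x0) / x0) * math.exp(-r * t))

t = 60.0  # days after epidemic onset (illustrative)
susceptible = disease_progress(x0=1e-4, r=0.25, t=t)
vertical    = disease_progress(x0=1e-6, r=0.25, t=t)  # less initial inoculum
horizontal  = disease_progress(x0=1e-4, r=0.15, t=t)  # slower infection rate
print(f"susceptible: {susceptible:.2f}")
print(f"vertical:    {vertical:.2f}")
print(f"horizontal:  {horizontal:.2f}")
```

Both forms of resistance delay the epidemic, but they act on different parameters: quantitative (horizontal) resistance appears as a reduced rate of epidemic development rather than as immunity, which is one reason it is easily masked in field assessments.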
The wide-scale use of monogenic resistance following the gene-for-gene pattern in a crop imposes extreme selection pressure on the pathogen population, which can lead to a rapid build-up of pathogen races with genes for virulence that match the resistance genes used in the cultivars (Vanderplank 1968, Leonard 1977). The recent history of breakdown of host resistance based on single major genes is well documented as far as biotrophic pathogens, for example stem rust of wheat and powdery mildew of barley, are concerned (Wolfe and Schwarzbach 1978, Jørgensen 1983, Leonard 1984). Particularly after Vanderplank (1963, 1968) had introduced his ideas of horizontal resistance and its better durability over vertical resistance, breeders widely began to incorporate quantitative resistance into breeding lines. For a long time it was assumed (Vanderplank 1963, Nelson 1978) that it is very difficult for pathogens to adapt to polygenic resistance. However, the better durability of horizontal resistance has been questioned in later studies (Caten 1974, Parlevliet 1981, Hwang and …), which indicate that significant host-cultivar and pathogen-isolate interactions have been found. Further, there are a number of cases of host resistance based on single major genes which have given relatively long-lasting disease protection (Russell 1978, Fry 1982, Johnson 1983, Parlevliet 1983). Consequently, it is frequently proposed (Habgood and Clifford 1981, Parlevliet 1981, 1983) that single genes can give long-lasting protection against many necrotrophic and soil-borne fungi and some viruses, while the protection is likely to be short-lived against biotrophic fungi.
However, the durability of a cultivar's resistance depends on many factors, such as the rate of pathogen reproduction, the number of generations per year, the efficiency of genetic recombination in the pathogen, as well as factors that affect selection pressures like the intensity of crop production and the popularity of the cultivars in the fields (Jenns et al. 1982). Moreover, Kiyosawa (1982) and Leonard (1984) have shown that durability does not depend on the genetic background of the cultivar only, but also on weather conditions. Northern climatic conditions affect the ecological aspects of host-parasite interactions in several ways (Karjalainen 1985 a). For example, field crop production in marginal areas is characterized by a limited number of pests and diseases because of cold climate, and severe epidemics of many pathogens are comparatively rare. In addition, short-season field production implies that many biotrophic pathogens, for example, powdery mildew of barley, produce a relatively limited number of reproduction generations because they attack cereals at the later stages of crop development and thus the effective time for reproduction is often very short. Consequently, it has been suggested (Karjalainen 1985 a) that in northern marginal areas resistance based on single genes can in many cases give long lasting disease protection.
3. Septoria nodorum on wheat: a polygenic host-pathogen system

3.1. Description, host range and distribution of the disease

Description

Septoria nodorum (Berk.) Berk. (perfect stage: Leptosphaeria nodorum Müller) was first described by Berkeley in 1845. He first designated it Depazea nodorum, but later corrected it to belong to the genus Septoria (Shipton et al. 1971). The identification of S. nodorum is most reliably done according to conidial morphology (Richardson and Noble 1970). The conidia have 1-3 septa and are usually no longer than 25 μm; however, the variability in conidial size and shape as well as colony morphology is large and may be affected by the environment (King et al. 1983). Harrower (1976) has found that S. nodorum can produce, through micropycnidia, spores that are smaller than the normal ones.
The symptoms of S. nodorum disease are difficult to distinguish and identify, especially in the early stages of infection when pycnidia, the sporulating bodies, are absent, and the disease can easily be confused with natural senescence (Shipton et al. 1971, Baker 1978). S. nodorum causes lesions on the leaves and stems, which appear as linear, light brown spots with a yellow margin (Figs 2, 3). There is wide variation in symptom expression, depending on isolate, host cultivar, and environmental factors. Later in summer the disease also infects the ears, on which brown spots appear (Fig. 4).

Host range
S. nodorum has been demonstrated to infect a number of grasses and cereals (Smedegaard-Petersen 1974, Mäkelä 1975, Harrower 1977). Isolates from wheat can infect alternative hosts (Shearer and Zadoks 1972 a), and extensive data (Fitzgerald and Cooke 1982, Krupinsky 1982) suggest changes in virulence after passage through an alternative host.
However, studies on physiological specialization have led to contradictory results. For example, Smedegaard-Petersen (1974) and Cunfer (1984; Cunfer and Youmans 1983) have demonstrated that isolates from wheat are characteristically virulent to wheat and avirulent to barley. However, Holmes and Colhoun (1971) and Martin and Cooke (1979) report that in Britain S. nodorum can be pathogenic to barley and cross infection is possible between isolates from barley and wheat.
Hitherto, there is no convincing evidence of significant isolate-cultivar interactions, and thus physiological specialization of S. nodorum on wheat appears to be non-race-specific (Allingham and Jackson 1981, King et al. 1983).

Distribution

S. nodorum is an important pathogen in many parts of Europe, North and South America, Africa, Asia, Australia, and New Zealand (Shipton et al. 1971, Eyal 1981, King et al. 1983). Available data on worldwide crop losses caused by S. nodorum are lacking. However, reports from various countries suggest yield reductions of up to 65 %, and in most cases around 25 % (Baker 1978, Eyal 1981, King et al. 1983). The disease has become increasingly important during the last ten years in many wheat-growing areas with cool damp summers and mild winters. The reason for this lies in the increasing area of intensive wheat cultivation with dwarf cultivars, which appear to be susceptible to S. nodorum (Saari and Wilcoxson 1974, Baker 1978, Eyal 1981). Further, increasing use of non-ploughing techniques, which results in less efficient stubble cultivation, can promote disease build-up (Baker 1978).

Figs 2-4. Symptoms of the disease caused by Septoria nodorum: lesions on leaves (Fig. 2), stems (Fig. 3), and heads (Fig. 4 a, b).

Primary sources of infection
Infected seed

Seed has been found to be an important source of inoculum in many wheat-growing areas (Shipton et al. 1971, Shaner 1981, King et al. 1983). The role of seedborne inoculum as the source of infection depends greatly on the longevity of the inoculum on the seed. Kruger and Hoffman (1978) found that the amount of S. nodorum on seeds depends on storage temperature, the rate of decline increasing with increasing temperature. Further, they noted that the seed was free of S. nodorum after two years' storage. However, recent studies (Cunfer 1981, Babadoost and Hebert 1984) indicate that S. nodorum can survive and remain virulent in storage for more than two years.
The seedborne Septoria first infects the growing coleoptile from hyphae growing up its outer surface (Baker 1971). Pycnidia on the coleoptile may provide secondary inoculum for infection of the leaves (Cooke and Fozzard 1973, Hewett 1975). Coleoptile infection appears to be more severe in soil with a high moisture-holding capacity (Holmes and Colhoun 1975). Recent studies by Babadoost and Hebert (1984) suggest that there is not necessarily a clear correlation between percent seed infection and percent germination. In addition, high seed infection is not always directly correlated with the severity of disease symptoms (Cunfer and Johnson 1981).

Plant debris
S. nodorum is known to survive for considerable lengths of time on plant debris from previous wheat crops, and it can also overwinter there (Brokenshire 1975, Holmes and Colhoun 1975, Harris 1979). Survival of the fungus depends on the temperature and humidity of the debris: if the debris is alternately wet and dry, new pycnidia form abundantly (Scharen 1964, Harrower 1974). Spores are released from pycnidia in wheat debris during rain and dispersed by splash (Faulkner and Colhoun 1976). Under favourable conditions, pycnidiospore production may continue for several months. Limited data from herbicide experiments (Harris 1979) suggest that herbicides do not affect the survival of Septoria on treated straw. The perfect stage, which occurs rarely, can also be an important source of inoculum (Sanderson and Hampton 1978). Perithecia of L. nodorum develop on straw and release ascospores during the season (King et al. 1983). Alternative hosts such as grasses and other cereals, on which S. nodorum can overwinter, may also provide inoculum reserves for disease build-up (Harrower 1977).

Factors affecting infection and symptom expression
Environmental factors such as humidity and temperature greatly affect the whole life cycle of S. nodorum (Shaner 1981). Release of pycnidiospores occurs when free water is present on the infected tissue or when atmospheric humidity is nearly saturated (Shaner 1981, King et al. 1983). Dispersal of spores is connected with periods of rainfall.
It is evident (Jeger et al. 1981 a, King et al. 1983) that only a small amount of rain falling over a short period is required to cause dispersal of spores. Humidity affects symptom expression to a great extent. For example, the rate of symptom development and the overall severity are closely correlated with high humidity (Scharen 1964, Holmes and Colhoun 1974, Eyal et al. 1977). The latent period of S. nodorum (Shearer and Zadoks 1972 b, Aust and Hau 1981) is also shorter under more humid conditions than under drier moisture treatments. Spores of S. nodorum will germinate at temperatures of 5-37°C, but the optimum is 20-25°C (King et al. 1983). However, the optimum temperature for infection is 18-25°C (Shipton et al. 1971). Some data suggest (Shipton et al. 1971) that at lower light intensities plants seem to be more susceptible than at higher intensities. Furthermore, it is well known (Cooke and Jones 1970) that light quality, for example the amount of near-ultraviolet (NUV) light, can promote pycnidium formation in leaves.
The growth stage of the host plant evidently has some influence on symptom expression (Pirson 1960, Shipton et al. 1971), but the results are so far contradictory. Evidence is accumulating (Brönnimann 1982; J. Jönsson 1983, pers. comm.) which suggests that wheat is more susceptible at later stages of development than at the seedling stage.
Fertilization may also have some influence on symptom expression, but very little is known at present. Brönnimann (1968) found that high nitrogen fertilization, and Cunfer et al. (1980) that excessive phosphorus and potassium levels in soils, make wheat more susceptible to S. nodorum. Fertilizers, as well as chlorocholine chloride (CCC) (Brönnimann 1969), can affect tissue susceptibility directly, or indirectly by making the microclimatic conditions more favourable to disease development.

Control measures
Several strategies can be used to control the disease caused by S. nodorum: crop rotation, resistant varieties, seed treatments, and fungicide sprays (King et al. 1983). The effect of crop rotation on reducing the development of S. nodorum is not well understood. Some data (Luke et al. 1983) suggest that a one-year rotation did not reduce the amount of disease when uninfected seed was used. In addition, Luke et al. (1983) found that a two-year rotation did not significantly reduce the amount of disease when infected seed was used. Thus it seems evident that one or two years' rotation has limited value when the percentage of seed infected is high and weather conditions favour disease build-up. However, long-term rotations (Eyal 1981) probably have great value in limiting inoculum potential. In addition, ploughing-in of infected stubble (Harrower 1974, Harris 1979) or burning it soon after harvest (Harrower 1974) is likely to reduce the inoculum potential for disease development.
Treatment of infected seed with organomercury compounds has been found to be efficient in controlling seedborne S. nodorum (Shipton et al. 1971). However, the control has not always been complete, and seed treatment has had only minor effects on the subsequent development of leaf attack (Bateman 1977, Obst 1977). Fungicide sprays are now widely used to control S. nodorum in many wheat-growing areas. There are several fungicides available (King et al. 1983), and many experiments (Cook 1977, Kucharek 1977, Eyal 1981) suggest profitable increases in yield. The benefit of chemical control depends on various factors, such as the price of the fungicide, the price of wheat, the amount of disease, and cultivar susceptibility. Recently, some studies (Cook 1979, Menz and Webster 1981) have clarified the economic returns and costs of controlling S. nodorum with fungicides. The proper timing of fungicide sprays is of crucial importance for an effective strategy (King et al. 1983). Cook (1977) found that fungicide application between flag leaf emergence and ear emergence resulted in considerable yield benefit. Early sprays seem to give good disease control only if the weather subsequently favours disease development (King et al. 1983). Thus, it appears that forecasting disease incidence is required for optimal strategies of chemical control.
The use of cultivar mixtures in cereal production has received much attention in recent years. There is a considerable body of data (Browning and Frey 1969, Ayanru and Browning 1977, Wolfe and Barrett 1982, Munk 1983) which shows that using host populations heterogeneous for resistance genes can control many fungal parasites effectively. Recently, Jeger et al. (1981 b, c) demonstrated theoretically and experimentally that intraspecific mixtures of spring wheat can reduce the disease development of S. nodorum. This is supported by my recent experiments (Karjalainen 1985 b) indicating that the disease level in a mixed stand was less than the arithmetic mean of the disease amount in pure stands. However, it was found that although the mixture reduced disease development, it did not seem to prevent yield losses under high disease conditions (Karjalainen 1985 b).

4. The aim of the present experimental study

Some years ago Mäkelä (1975, 1977) made extensive surveys of the occurrence of Septoria species in Finland and found that S. nodorum was one of the major diseases of wheat. No estimates of the economic importance of this disease are available, but practical experience suggests that in rainy years S. nodorum has probably caused significant yield reductions in wheat crops. Foreign experience suggests that sources of resistance are available in the wheat germplasm, although cultivars which are immune to S. nodorum have not been found (Baker 1978).
Despite the polygenic nature of resistance, significant progress in breeding for S. nodorum resistance has been achieved (Brönnimann and Fossati 1977, Scott and Benedikz 1977, Scharen and Eyal 1980), and genetic resistance appears to be the most feasible method of controlling the disease in terms of economic advantage over other control methods (Doodson 1981).
Although some aspects of this host-pathogen interaction, notably the epidemiology, have been extensively investigated (e.g. King et al. 1983), little is known about the mechanisms of resistance; for example, the ultrastructural nature of the S. nodorum-wheat interaction is largely unknown. Furthermore, relatively little is known about the value of seedling plant tests for predicting the field reactions of cultivars to S. nodorum at the adult plant stage. In Finland, breeding wheats for resistance to S. nodorum is just starting, and practical experience is limited. Hence, in 1979, a research programme was started which was mainly aimed at providing relevant basic information for wheat breeders and at strengthening breeding work for resistance to S. nodorum. More specifically, the aims of the investigation were:
(1) to reveal some ultrastructural interactions between S. nodorum and spring wheat
(2) to study the effects of S. nodorum on yield and yield components of spring wheat
(3) to study the inheritance of resistance with applications for selection strategies
(4) to evaluate seedling plant and mature plant screening techniques
This experimental investigation is based on the data presented in this study and on the following articles, which are referred to by their Roman numerals.
I Karjalainen, R. & Lounatmaa, K. 1984

Materials and methods
Production of inoculum

S. nodorum isolates were collected from infected wheat leaves in southern Finland. For mass production the most virulent isolates were selected and grown on oatmeal agar in Petri dishes under near-ultraviolet irradiation with cool-white fluorescent lamps for 7-10 days at 19-22°C. NUV light has been shown to stimulate pycnidia formation of S. nodorum (Cooke and Jones 1970).
In order to obtain densely sporulating cultures some spore transfers were made. However, old cultures were avoided because of decreasing aggressiveness with increasing age (Scharen and Krupinsky 1973). The plates were flooded with a small amount of sterile water, and the cirrhi containing pycnidiospores were removed by gently rubbing the agar surface with a glass rod. The conidial suspension was filtered through two layers of cheesecloth, and a spore concentration of 10⁶ conidia/ml, but also higher concentrations, was used, as suggested by earlier investigations (Krupinsky 1976, Eyal and Scharen 1977). To obtain better adherence of the conidia to wheat leaves, 'Tween 20' surfactant was used.

Electron microscopy
For scanning electron microscopy, sections of about 5 mm from infected leaves were taken two days after inoculation, dehydrated through an ethanol series, dried by the critical point method in an Aminco apparatus, and coated with gold in a Balzers Micro-BA3 apparatus. The scanning electron micrographs were taken with a Jeol JSM-U3 electron microscope operating at 12 kV.
For thin sections, samples were taken 3, 6, and 9 days after inoculation and fixed in 2.5 % glutaraldehyde in sodium phosphate buffer (0.1 M, pH 7.2) for 2 h. The samples were then washed four times with phosphate buffer, postfixed for 2 h with buffered 1 % osmium tetroxide at room temperature, and dehydrated in a graded ethanol series. Thin sections were obtained from 'Epon 812'-embedded samples and stained with uranyl acetate and lead citrate. For freeze-fracturing, the samples were frozen in liquid Freon in the presence of 30 % (v/v) glycerol and fractured at -120°C. The freeze-etching techniques used in the Septoria studies have been described by Karjalainen and Lounatmaa (1984 I). Transmission electron micrographs were taken using a Jeol JEM-100B transmission electron microscope operating at 80 kV.

Field experiments
The field experiments were carried out at the University Farm of Viikki, except for experiment IV, which was carried out at the Anttila Field Station of the Hankkija Plant Breeding Institute. The crop loss data reported in this paper are based on trials carried out in 1984. Cultivars Tähti and Kadett were used in the experiments. In the first part of the study, normal breeding plots (10 m²) with three replications arranged in a randomized block design were used. Standard fertilization and herbicide treatments were applied. In the second part of the study, large plots (20 m²) with four replications arranged in a randomized block design were used. Only a 10 m² area of each plot was harvested for yield measurements, so that edge effects were avoided. The plots were inoculated with spore suspensions (10⁶ conidia/ml) of S. nodorum three times, starting before flag leaf emergence and repeated at three-day intervals. Yield comparisons were made in relation to uninoculated control plots.
The field screening data reported in this paper are based on the procedure described by Scott and Benedikz (1977). Small plots (0.3 m²) with six replications arranged in a randomized block design were used. Inoculum was applied onto the plants by spraying (with a spray gun; 10⁶ conidia/ml) soon after ear emergence. After inoculation the plants were kept damp by covering them with plastic bags for 48 hours, followed by daily irrigation starting one week after inoculation. Disease severity was assessed by estimating the percentage of leaf area covered by lesions 7, 9, and 12 days after inoculation.
The inheritance data reported in this paper are based on the F2 progenies derived from the crosses Hja 21600 × 80325, Hja 21600 × Cl 12463, Luja × 80325, and Hja 21600 × Cl 12463. The F1 generation was grown in the glasshouse and produced, by self-fertilization, material for the F2 populations.
The parents were selected to exhibit a wide variation in height and resistance, Cl 12463 and 80325 being resistant and tall, Luja and Hja 21600 very susceptible and short. Small plots were space-planted with three replications arranged in a randomized block design.
Standard fertilization and herbicide treatments were used. Inoculum was applied onto the plants by spraying before flag leaf emergence. After inoculation the plots were kept damp by covering them with a plastic tent for 48 hours. Irrigation was not necessary because of rainy weather. Disease severity was assessed by estimating the percentage of leaf area covered by lesions. At the same time, plant height was measured.

Seedling plant tests
The detached-leaf seedling tests reported in this paper are based on the method described by Benedikz et al. (1981). Detached spring wheat leaves taken from plants at the two-leaf stage were mounted on benzimidazole agar, and localized inoculum drops of 3 µl (10⁷ conidia/ml) were placed on the leaves using a microsyringe needle. Ten leaves per Petri dish with three replications were used (the total range 27-30 leaves). The assessment of resistance was based on the measurement of lesion length 8-10 days after inoculation.
The second seedling test reported in this paper is based on a simplified attached-leaf inoculation method. The seedlings were grown in small pots (diameter 10 cm) containing fertilized Finnpeat. Approximately ten seeds per pot were used, with 8 replications arranged in a randomized block design. The plants were kept in glasshouses at 18-21°C with supplementary light (Philips lamps) providing an 18 h photoperiod. Leaves were inoculated with a spore suspension (10⁷ conidia/ml) containing 0.5 ml 'Tween 20' surfactant per 100 ml of suspension. Inoculum was applied with a spray gun at the two-leaf stage. All plants were enclosed within polyethene bags for 5 days to provide high humidity for infection. Disease severity, expressed as percent diseased leaf area, was assessed 7 and 9 days after inoculation.

Statistical analysis
The percentages of diseased leaf area were transformed using the arcsine (angular) transformation.
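As an illustrative sketch only (the function name and example values below are invented, not taken from the original analyses), the angular transformation can be written as:

```python
import math

def arcsine_transform(percent):
    """Arcsine square-root (angular) transformation of a percentage.

    Commonly used to stabilize the variance of percentage data,
    such as percent diseased leaf area, before analysis of variance.
    Returns the transformed value in degrees.
    """
    proportion = percent / 100.0
    return math.degrees(math.asin(math.sqrt(proportion)))

# Hypothetical disease severity scores (% diseased leaf area)
severities = [2.0, 10.0, 25.0, 50.0, 90.0]
transformed = [arcsine_transform(p) for p in severities]
# e.g. 50 % leaf area transforms to 45 degrees
```

The transformation compresses the extremes of the 0-100 % scale, which is why it was routinely applied to disease severity percentages before analysis of variance.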
Analysis of variance for comparing yields between treatments was calculated (Karjalainen et al. 1983 IV). Spearman's rank correlation, regression analysis, and correlation analysis were computed. In addition, path-coefficient analysis was calculated (IV) according to Li (1975) in order to clarify the direct and indirect effects of S. nodorum on yield components. Statistical differences between cultivars are not shown in order to emphasize the relative values and the fact that the S. nodorum-wheat system is sensitive to genotype-environment interactions.
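The broad-sense heritability estimation used in the inheritance analysis can be sketched as follows. This is a minimal illustration with invented symptom scores, assuming (as is conventional) that the environmental variance is approximated by the phenotypic variance of a genetically uniform, non-segregating generation such as the parents:

```python
from statistics import pvariance

def broad_sense_heritability(f2_scores, uniform_scores):
    """Upper-limit estimate of heritability in the broad sense.

    h2_bs = (V_F2 - V_E) / V_F2, where V_F2 is the phenotypic
    variance of the segregating F2 population and V_E is the
    environmental variance, estimated here from a genetically
    uniform generation (parents or F1).
    """
    v_f2 = pvariance(f2_scores)      # genetic + environmental variation
    v_e = pvariance(uniform_scores)  # environmental variation only
    return (v_f2 - v_e) / v_f2

# Invented symptom scores (% diseased leaf area) for illustration
f2 = [12, 18, 25, 30, 35, 40, 22, 28, 15, 33]
parents = [20, 24, 26, 22, 25, 23]
h2_bs = broad_sense_heritability(f2, parents)
```

Because the F2 variance contains both genetic and environmental components while the uniform generation reflects environment alone, the ratio gives an upper limit for the heritable fraction of variation in symptom expression.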
The objective of the present inheritance study was to evaluate the upper limit for narrow-sense heritability of total variation in symptom expression. Two crosses, Hja 21600 × 80325 (N = 67) and Hja 21600 × Cl 12463 (N = 106), were used for calculations of broad-sense heritability estimates using a general equation (e.g. Griffiths and Lawes 1978):

h²bs = (VF2 - VE) / VF2

where h²bs = heritability in the broad sense, VF2 = phenotypic variance of the F2 population, and VE = environmental variance, estimated from a non-segregating generation.

Fine structure of S. nodorum

Numerous studies have been carried out on the fine-structural features of many saprophytic fungi (Beckett et al. 1974), but knowledge of the ultrastructure of many important pathogenic fungi is still lacking (Cole et al. 1979, Griffin 1981). Hitherto, very little has been known about the fine structure of S. nodorum despite the fact that it is a widespread and important pathogen. Recently, Karjalainen and Lounatmaa (1984 I) attempted to describe some fine-structural features of S. nodorum using electron microscopic techniques. Transmission electron microscopy (TEM) of thin sections of hyphae revealed such cell organelles as the nucleus and mitochondria (Figs 5, 6).
Freeze-etching techniques (Branton 1966) appear to be a powerful method of revealing unique features of the membrane structures of bacteria and fungi (Griffiths 1971, Cole et al. 1979). In freeze-etch replicas of cross-fractured spores of S. nodorum (Fig. 7), such organelles as the nucleus with nuclear pores and a large lipid body are evident.
Recently, much effort has been concentrated on trying to understand cellular interactions between plants and pathogens. Available data (Keen 1982) suggest that in many cases the early phases of the interaction between plant and fungi are mediated through molecules via cell wall contact. Therefore, the knowledge of cell wall surface structures of plant pathogenic fungi might give important information which could help to clarify the early stages of plant-pathogen interaction (Rohringer et al. 1982). In connection with the study on the ultrastructure of the interaction between S. nodorum and wheat leaves, a preliminary study was made to clarify some ultrastructural features of the cell walls of S. nodorum using different electron microscopic techniques (Karjalainen and Lounatmaa 1984 I). Thin sections of hyphae and pycnidiospores suggest the wall to be composed of similar structures, an outer electron-dense layer and an inner transparent layer (I). The freeze-etching technique also revealed some additional information on the surface features of the outer cell wall layer.
The etched surface appears to be composed of thin fibrillar and globular material of different sizes (I).
The thin material revealed by freeze-etching is probably fibrillar material, since chitinous residues of fungal walls are often fibrillar when examined by electron microscopy (Jones and Johnson 1970). Data based on many conidial fungi (Cole et al. 1979, Schneider and Wardrop 1979) suggest that the fibrillar material is often composed of glycoprotein and chitin polysaccharide. The precise role of this material is not well understood (Cole et al. 1979). However, it is suggested (I) that it may be important in the attachment of a hypha to a host cell, but this remains to be determined after detailed chemical analyses of surface structures. In addition, it might be necessary to use genetic mutants of isolates in trying to find solutions to questions of the primary role of certain surface components in host-pathogen interactions. Recently, Sequeira (1984) provided an experimental framework using mutants of Pseudomonas bacteria as a model to study the role of surface components in early phases of plant-bacterium interaction.

Penetration of wheat leaves

Biotrophic parasites generally cause minimal disturbance to host cells during penetration (Aist 1976). On the contrary, many necrotrophic pathogens, including fungi and bacteria, are capable of producing a wide range of cell wall degrading enzymes matching the diverse polymers in plant cell walls (Cooper and Wood 1980, Bishop and Cooper 1983, Cooper 1983). There are extensive data on the ultrastructural aspects of penetration by biotrophic pathogens (e.g. Ingram et al. 1976, Maclean and Tommerup 1979). However, the ultrastructural nature of penetration is known only in a few necrotrophic interactions (Politis 1976, Wheeler 1977, Van Caeseele and Grumbles 1979, Bishop and Cooper 1983, Keon and Hargreaves 1983). It has previously been shown (Bird and Ride 1981) that most S. nodorum spores germinate within 6-8 h after inoculation, and penetration is observed about 10 h after inoculation.
However, there are no detailed data on the ultrastructure of penetration, which has partly been explained (Baker and Smith 1978) by difficulties in staining the hyphae within the leaf. Karjalainen and Lounatmaa (1985 II) made an attempt to clarify the ultrastructural nature of penetration. Scanning electron microscopic (SEM) observations on the early stages of S. nodorum infection indicate (Figs 8, 9) spore germination and hyphal ramification over the surface of wheat leaves. SEM studies (Baker and Smith 1978) have previously shown that S. nodorum conidia germinate, produce appressoria, and penetrate into host epidermal cells. In order to clarify the penetration event, it is necessary to make thin sections after inoculation for transmission electron microscopic observations. TEM studies (II) showed that S. nodorum appears to grow both intercellularly (Fig. 10) and intracellularly (II). The hyphae were often surrounded by amorphous material (Fig. 11), which seems to assist their attachment to the host cell wall. In many cases it was evident that the hyphae were closely associated with the host cell wall. Penetration appears to take place directly through intact cell walls (Figs 12, 13). Some observations (II) suggest that the host cell wall in the area around the penetration peg was more diffuse than that of adjacent areas.
The study by Karjalainen and Lounatmaa (1985 II) clearly indicates that S. nodorum penetrates wheat leaves directly through the epidermal cell walls. There has been a long controversy among scientists as to whether cuticular penetration occurs by mechanical force or by enzymic hydrolysis (Verhoeff 1980, Cooper 1983, Kolattukudy and Köller 1983). The data presented in our study do not provide definite proof of either mode of penetration. However, limited ultrastructural observations (II) on the electron-lucent dissolution of wall material in front of the penetration peg, as well as observations that the part of the host cell wall in contact with the penetration peg was more electron-dense than other parts, provide some clues for the possible role of enzymic digestion in penetration, as has been suggested on similar grounds in some other studies (McKeen 1974, Wheeler 1977, Bishop and Cooper 1983). So far, there is no evidence of such cell wall degrading enzymes in vivo, although some data (Baker 1969) show that S. nodorum produces in culture enzymes such as pectic methylesterase and polygalacturonase. Thus attempts to localize specific enzymes during infection using modern immunocytochemical techniques (Shaykh et al. 1977) might provide some evidence for their involvement in penetration.

Nature of cellular resistance reactions
It is well known that thickening on the inner surface of cell walls at the site of penetration by fungi is a common reaction of plant cells to fungal invasion (Aist 1977, Israel et al. 1980, Beckman et al. 1982, Allen and Fried 1983). This type of defence reaction is probably responsible for the unsuccessful penetrations of many nonpathogenic and pathogenic fungi, and it might explain why only a small number of micro-organisms are pathogenic. For example, in leaves of Gramineae, cell wall alterations, appositional wall formation, and papilla formation are frequently associated with unsuccessful penetration of the epidermal cells by pathogenic fungi (Politis 1976, Ride 1978), and hence their important role in resistance has been proposed (Ride 1978, Sherwood and Vance 1982).
Previous studies on the wheat-S. nodorum interaction (Baker and Smith 1978, Bird and Ride 1981) throw some light on the possible mechanisms of resistance. For example, Bird and Ride (1981) reported that germ tubes were shorter on more resistant varieties, and on this basis cultivar differences can be detected as early as 6-8 h after inoculation. Using light and stereoscan microscopy, Baker and Smith (1978) were able to demonstrate differences in hyphal development between resistant and susceptible varieties. However, the biochemical basis of these differences is largely unknown. Furthermore, Baker and Smith (1977) made experiments in order to test the role of antifungal compounds involved in resistance, but no convincing evidence has yet been published. Bird and Ride (1981) demonstrated that lignification can play an important role in restricting fungal development. However, Baker and Smith (1978) did not find evidence which would support the role of lignin as an essential component of resistance.
In order to obtain information on the cellular nature of resistance, thin sections of highly resistant wheat leaves were made after inoculation for transmission electron microscopy (Karjalainen 1985 III). The results suggest that S. nodorum grows slowly and ramifies over the leaf surface, searching for a suitably thin place for penetration (Figs 12, 14). It appears (III) that the majority of penetration attempts fail, which seems to be associated with cell wall alterations and the formation of electron-dense material (papillae) beneath the point of penetration (Figs 15, 16). These observations support the idea of Bird and Ride (1981) that the failure of S. nodorum to penetrate epidermal cells is associated with the deposition of new wall-like material (papillae) and with alterations in the upper epidermal walls and adjacent lateral walls. The composition of these appositions and papillae is not known (Aist 1976), but there is some evidence for wheat (Ride and Pearce 1979, Ride 1983) suggesting that they contain lignin-like material. Thus it is possible that the depositions and papillae found in this work (II, III) are lignin-like material. Bird and Ride (1981) have suggested that lignification is an important factor preventing hyphal development, and that it is more efficient in resistant cultivars. It was also apparent from the ultrastructural data of highly resistant interactions (III) that cell wall alterations and papilla formation were frequent, which probably explains why it was very difficult to find successful penetrations. Increasing evidence has recently accumulated which suggests that lignification may play an important role in the general defence of plants against pathogen attack (Hammerschmidt and Kuć 1982, Sherwood and Vance 1982, Hammerschmidt et al. 1984).
Ride (1980, 1983) has suggested several ways in which lignification might contribute to resistance: it might provide a mechanical barrier to hyphae, restrict the diffusion of water and nutrients or toxins between host and parasite, impregnate hyphal walls and reduce their capacity to elongate, or protect host wall polymers from enzymic degradation. It is probable that in the wheat-S. nodorum interaction lignification might function as a mechanical barrier to hyphae and thus restrict their growth. Furthermore, it is also possible that lignified cells might to some extent hinder host cell degradation by fungal cell wall enzymes, assuming that enzymic hydrolysis has a decisive role in penetration. In addition, lignification may alter the toxin sensitivity of the host, since there are some reports on non-specific toxins isolated from S. nodorum (Bousquet and Skajennikoff 1974, Kent and Strobel 1976). However, it is very probable that the mechanisms of resistance to S. nodorum are complicated, because this host-pathogen interaction is under polygenic control and is non-race-specific, and such systems generally involve several mechanisms of resistance (Touzé and Esquerre-Tugayé 1982). Hence lignification, cell wall alterations and papilla formation may account for some part of the defence system, but many other factors may be involved in resistance as well. Since the mechanism is complex, it is also evident that the expression of different resistance mechanisms may be effective at different stages of the infection process (III).

Pathological alterations in host ultrastructure
The penetration of the plant by a pathogen leads to diverse changes in host physiology, such as an increased rate of respiration (Uritani and Asahi 1980, Kosuge and Kimpel 1981), a decreased rate of photosynthesis (Buchanan et al. 1981), and changes in secondary metabolism (Friend 1981). It is widely accepted (Aist 1976, Cooper 1981) that biotrophic parasites cause minimal changes in host metabolism compared with necrotrophs, which frequently cause severe disruption of the host cytoplasm. An important observation is that necrotrophs are capable of causing drastic changes in host metabolism, often in advance of or during penetration (Wheeler 1977, Cooper 1981), while biotrophs cause prominent changes usually at the later stages of pathogenesis. Plant pathogens are also known to cause changes in host ultrastructure (Coffey et al. 1972, White et al. 1973, Heath 1974, Jones et al. 1975, Cooper 1981). Figures 17 and 18 show that S. nodorum also alters the ultrastructure of wheat leaves. For example, deformation and disintegration of chloroplast grana and lamellae are evident, as well as an increased number of plastoglobuli (Figs 19, 20). Similar observations have been made in some other necrotrophic interactions (Jones and Ayres 1974, Cooper 1981). These observations fit well with the physiological studies (Scharen and Taylor 1968, Scharen and Krupinsky 1970) which indicate that photosynthesis is strongly inhibited by S. nodorum infection. Thus electron microscopic data can provide additional evidence for the observation (Scharen and Taylor 1968) that S. nodorum reduces wheat photosynthesis and may in this way reduce grain yield.

Effects of S. nodorum on wheat yield and yield components
The impact of plant pathogens as yield-reducing factors has been known for a long time, but the exact amount of crop loss caused by them is difficult to determine. Generally, the amount of pathogen-induced yield reduction depends on the incidence and severity of pathogen attacks, on their influence on plant physiological processes, and on the significance of these processes for yield development (James and Teng 1979, Gaunt 1980).
Estimates of national crop losses due to glume blotch disease are few. However, in England and Wales routine surveys of leaf diseases have provided some estimates of the economic importance of crop losses (King 1977). For example, surveys of winter wheat in the 1970s (King 1977) showed that Septoria is the second most important leaf disease, and Doodson (1981) estimated losses over the ten-year period to be about £ 82 300 000.
There are no extensive surveys of glume blotch disease in the Scandinavian countries. However, it has been known for a long time (Mäkelä 1975) that S. nodorum occurs commonly on wheat in Finland. Crop loss experiments were therefore carried out at the Hankkija Plant Breeding Institute (Karjalainen et al. 1983 IV) and at the University Farm of Helsinki University to provide a sound basis for the economic evaluation of S. nodorum and its significance for plant breeding.
The aim of the study carried out at the Hankkija Plant Breeding Institute was to clarify how a low infection level affects the yield of the moderately susceptible cultivar Hankkija's Taava. Therefore, a low-concentration spore suspension of S. nodorum was sprayed onto the plots at the heading stage, and the yield response was compared with uninoculated control plots (IV). The results indicated that low infection pressure reduced grain yield by 10 % and 1000-grain weight by 14 % (Table 2). Data on single tillers (IV) suggest that S. nodorum strongly reduced grain number per ear and, to some extent, spikelet number per ear. Consequently, this study suggests that even under a moderate level of infection S. nodorum is capable of markedly reducing yield. However, as Table 2 shows, the yield level in this experiment was relatively low, which was due to unfavourable weather conditions (IV), and hence wide generalizations from these results should be avoided. The data from these studies are in accord with several previous investigations (Brönnimann 1968, Spierz 1973, Nelson et al. 1976) which indicate that a moderate or high infection level at a later developmental stage causes severe yield reductions, mostly due to lowered grain weight. It is known (e.g. Evans 1975) that the grain yield of wheat is determined mainly by carbohydrates produced after ear emergence by the flag leaf, head and peduncle, and the development of these organs also depends on assimilates translocated from the leaves below the flag leaf.
Consequently, S. nodorum may affect yield, for example, indirectly by damaging lower leaves, thus reducing the number of sites for assimilate depositions (Scharen and Taylor 1968) and directly by damaging the flag leaf with the consequence of reduction in overall photosynthesis and assimilate accumulation.
Leaf pathogens are known to reduce the rate of photosynthesis (Buchanan et al. 1981). There are some data (Scharen and Taylor 1968, Scharen and Krupinsky 1969) which show that S. nodorum causes a strong reduction in the photosynthetic rate, but the reason for this is not known. Although there is some evidence (Buchanan et al. 1981) that infection can reduce the number of chloroplasts, there is no direct evidence that this has any major effect on the photosynthetic rate. On the other hand, some data from physiological studies (Magyarosy and Malkin 1978) suggest that infection can alter the partial reactions of photosynthesis, such as photophosphorylation, the electron transport chain, and ribulose-1,5-bisphosphate carboxylase activity. Very recently, Walters and Ayres (1984) provided evidence that ribulose-1,5-bisphosphate carboxylase (RuBPase) of barley, a key enzyme in photosynthesis, is inhibited by mildew infection. However, as there are no data from similar studies on Septoria-infected wheat leaves, it remains to be shown whether the reduction in photosynthesis can be explained on similar grounds.
It has been suggested (Brönnimann 1968) that yield loss caused by S. nodorum may be partly explained by the fact that infection interferes with translocation. S. nodorum infection may enhance translocation because it appears to accelerate the onset of senescence (Spierz 1973). However, there is no evidence that interference with translocation by infection explains any important part of yield losses. For example, Wafford and Whitbread (1976) demonstrated that despite extensive lesions on the leaves, S. nodorum infection appeared to alter the export of assimilates from a leaf only to a small extent, and obviously this also has little effect on the patterns of assimilate distribution. Scharen et al. (1975) also found that axial lesions caused by S. nodorum did not interfere with the translocation of assimilates.
Extensive data on cereal foliage diseases indicate (James and Teng 1979, Teng and Gaunt 1980-81, Carver and Griffiths 1981) that the disease severity and the timing of the attack determine which yield components are most affected. Generally, early attacks mainly affect the number of fertile tillers and grains per ear and to a lesser extent grain size, while late attacks mainly reduce grain size. S. nodorum is capable of attacking wheat at all stages of growth. It has been suggested (Baker 1978) that early attacks stunt the plant and thus disturb growth, but very little is known about the significance of early infection for yield reduction. A number of studies have led to the idea that late attacks are the most damaging. The present study confirms this idea, since inoculation at the later phase of development caused a strong reduction in yield. Late infection in particular had a great influence on 1000-grain weight (Fig. 21) (IV), which has been confirmed by a number of previous studies (Jones and Odebunmi 1971, Jones and Rowling 1976, Wafford and Whitbread 1978). This is easily understood, because post-anthesis photosynthesis can greatly contribute to grain-filling, and Lupton (1969), for example, has shown that the ear receives assimilates almost exclusively from the flag leaf. Hence damage to the upper leaves has a strong effect on grain weight. Karjalainen et al. (1983 IV) demonstrate that heavy infection seems to reduce all yield components, but they also suggest that in some cases the reduction in grain weight can be compensated by more grains being filled per ear in some tillers. This observation is in accord with those of Jones and Rowling (1976) and Wafford and Whitbread (1978), who also found compensation effects. However, the compensatory mechanism might be too weak after severe infection and cannot prevent yield reductions.
It has been observed (Brönnimann 1968, Scharen and Taylor 1968, Obst 1977) that S. nodorum is capable of causing yield reductions that are not always correlated with the symptoms on which disease assessments are based. Karjalainen et al. (1983 IV) also found some evidence that disease severity was not always correlated with yield loss, and even susceptible cultivars can possess some tolerance to attacks by S. nodorum (Brönnimann 1968). It seems, however, difficult to explain why in some cases lower infection causes comparatively heavy yield reductions. It has been suggested that S. nodorum can cause yield reductions without causing symptoms (Obst 1977, Bannon 1978). Further, Brönnimann (1968) and King et al. (1983) have suggested that yield damage may be partly due to the effect of a toxin.
There is some information on toxins purified from S. nodorum cultures. A phytotoxin, septorin, produced in vitro has been implicated in causing changes in the respiratory activity of wheat mitochondria (Bousquet et al. 1980). Ochracine, the other toxin purified from S. nodorum, has been found to inhibit the net assimilation rate of wheat (Bousquet et al. 1980). Consequently, it has been suggested (King et al. 1983) that the toxin effect might to some extent account for yield losses at low infection levels. However, the available data are too limited to draw any firm conclusions about the role of toxins in the yield losses caused by S. nodorum.

6.3. Genetic nature of resistance

6.3.1. Inheritance of resistance

Genetic progress in plant breeding for disease resistance depends on several factors, such as the availability of resistance sources, appropriate screening techniques, and the genetic nature of the host-pathogen interaction. Previous studies (Laubscher et al. 1966, Nelson 1980, Mullaney et al. 1982, Scott et al. 1982) provide evidence for polygenic control of host reaction to S. nodorum both at the seedling plant stage and at the mature plant stage. However, Kleijer et al. (1977) have shown a single dominant gene to give simply inherited monogenic resistance at the seedling plant stage. Hitherto, however, there is not enough evidence for the idea that single genes alone can give an easily detectable level of resistance at the mature plant stage (Scharen and Eyal 1980, Scott et al. 1982). In the first investigation it was concluded that in these particular crosses genetic variation is rather low and environmental variation accounts for a major part of the total variation in symptom expression.
Consequently, reliable selection in F2 populations greatly depends on efficient screening techniques to detect slight differences between plants.
In the second investigation (Table 3) the results are in line with Scharen and Eyal (1983), suggesting that the resistance of highly resistant cultivars may be governed by major R-genes.
Hence, selection for resistance to S. nodorum in F2 populations based on crosses involving highly resistant parents is expected to be rather rapid, because the heredity component of variation is large enough to permit efficient selection. Table 3 shows that the estimates of heritability vary widely.
Most of these studies indicate a moderate level of heritability, and some reveal very high levels of genetic variation (Aastveit 1982). On the other hand, Scott et al. (1982) have suggested that in most of their winter wheat crosses the heritabilities were low and the standard errors high. One reason for this variation may be that the experiments have been carried out at different growth stages and under varying environmental conditions using different inoculation techniques. There is a considerable body of evidence (Scharen and Eyal 1980, Fried and Brönnimann 1982, Mullaney et al. 1982) that this host-pathogen interaction is not only sensitive to inoculation differences but also to environmental changes which may mask genetic differences.
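The heritability estimates discussed above are typically derived from variance components. As a hedged sketch (with hypothetical disease scores, not the actual data of Table 3), broad-sense heritability can be estimated from an F2 population by taking the environmental variance from the genetically uniform parents:

```python
# Hedged sketch: broad-sense heritability of symptom expression from an F2
# population, with environmental variance estimated from the (genetically
# uniform) parental lines. All scores below are hypothetical 0-9 ratings.
from statistics import pvariance

def broad_sense_h2(f2_scores, p1_scores, p2_scores):
    v_f2 = pvariance(f2_scores)                              # genetic + environmental
    v_e = (pvariance(p1_scores) + pvariance(p2_scores)) / 2  # environmental only
    return max(0.0, (v_f2 - v_e) / v_f2)

f2 = [2, 3, 5, 6, 4, 7, 3, 5, 8, 4, 6, 5]   # segregating generation
p1 = [2, 4, 3, 2, 4]                         # resistant parent
p2 = [6, 8, 7, 6, 8]                         # susceptible parent
print(round(broad_sense_h2(f2, p1, p2), 2))
```

With such small samples the estimate carries a large standard error, which mirrors the variability of published values noted above.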
Available data from genetic studies shed some light on practical wheat breeding strategies against S. nodorum. It is obvious from the results of the present investigation and from the other data (Table 3) that the genetic variation in symptom expression is large enough to provide reliable grounds for selection work.
In addition, Mullaney et al. (1982) have recently shown that the genetic control of leaf resistance can be mainly explained by additive gene effects. Similar results have been obtained from diallel experiments by Nelson (1980) and Nelson and Gates (1982).
Brönnimann (1975) showed that tolerance, measured as the loss of 1000-kernel weight, is polygenically determined and controlled by additive gene action. Nelson (1980) found that general combining ability effects were highly significant, but specific combining ability effects were observed as well, indicating non-additive gene action in some specific crosses. Thus evidence is accumulating that the resistance of wheat to S. nodorum is mainly controlled by additively acting genes, which means that the breeder may try to improve the level of resistance by accumulating desired genes into populations. Transgressive breeding may thus be one way of improving the level of resistance of wheat to S. nodorum.

6.3.2. Association of resistance with agronomic traits

Advances in breeding for disease resistance have sometimes been hampered by the association of resistance with agronomically undesirable characteristics (Simmonds 1979). Some reports suggest that wheat resistance to S. nodorum is positively associated with late maturation time (Scott 1973, Eyal 1981, Scott et al. 1982, Karjalainen et al. 1983) and tallness (Hope 1957, Brönnimann 1969, Scott et al. 1982). The association between tallness and resistance has been explained (Scharen 1964) by the idea that infection usually starts at the base of the plants and moves upwards by splash dispersal of the conidia. Hence short cultivars generally become more severely infected than tall ones because their tissue is nearer to the source of inoculum (Fried and Brönnimann 1982). It is also apparent that the canopy of short cultivars remains wet longer than that of tall ones, which may provide favourable conditions for disease development (Karjalainen 1984).
It is important for wheat breeders to know whether this association between tallness and disease severity also holds in segregating generations, since the general aim of wheat breeding is a moderately short straw, which guarantees a reasonable level of lodging resistance. In order to clarify this question, some crosses between susceptible short cultivars and lines (Hja 21600 and Luja) and highly resistant tall lines (80325 and Cl 12463) were carried out. The results clearly indicate (Fig. 22) that susceptibility is significantly negatively correlated with tallness, the coefficients ranging from -0.58 to -0.63 (p < 0.001). Consequently, it is evident that in these crosses a great part of the variation in leaf resistance was associated with plant height. Hence wheat breeding for resistance to S. nodorum faces difficult problems if short straw, as is normally the case, is an important breeding objective. Rigorous selection for resistance in F2 populations is likely to move populations towards increased tallness. The strength of the association between tallness and resistance depends on its genetic nature. Scott et al. (1982) carried out extensive studies to clarify the genetic nature of this association. Their data suggest that the association between plant height and resistance is mainly due to pleiotropy rather than linkage. Thus, one or more genes that promote short straw may also promote susceptibility. However, Scott et al. (1982) also found that an important part of the variation was independent of height. The present data support this idea, since there was a considerable level of resistance in medium or short straw plants (Fig. 23).
A significant negative association between earliness and resistance is evident from the data based on cultivar trials (Fig. 24), thus supporting previous studies (Fig. 25; Karjalainen et al. 1983). This association poses a problem comparable to that between plant height and resistance, since earliness is a major criterion in Finnish wheat breeding (Mukula et al.). Although the present study and other data suggest that resistance to S. nodorum is often associated with late maturation time and long straw, there are, however, many deviations. Practical evidence (e.g. Scott et al. 1982) and the present cultivar trials indicate that moderately short and early cultivars with a moderate degree of resistance have been produced. Scott et al. (1982) have also stated that the success of breeding depends more on efficient selection than on the availability of resistance sources.

6.4. Assessment of disease resistance

6.4.1. Detection of resistance at the adult plant stage under field conditions

Quantitative resistance is characterized by continuous variation in symptom expression, and it cannot be separated into clear-cut classes. Consequently, one of the major problems in utilizing this type of resistance is to find screening techniques sensitive enough to distinguish genetic differences from environmental effects. Rosielle and Brown (1980) suggest that three mechanisms of resistance operate in the S. nodorum-wheat system: escape mechanisms, true resistance, and tolerance. Very often the resistance to S. nodorum is masked by different escape mechanisms (Fried and Brönnimann 1982). For example, under field conditions late cultivars often appear to be more resistant than early ones (Karjalainen et al. 1983). However, in years with a rainy late summer, susceptible late cultivars may suffer great losses because their grain-filling process is slow and thus sensitive to late attacks by S. nodorum, with the consequence of a heavy reduction in yield.
Several screening techniques are used in breeding wheat for resistance to S. nodorum, but many of these methods have given different results even though the same lines or cultivars have been tested (Rufty et al. 1981a). During the last ten years extensive work has focused on improving the reliability of methods for testing the reactions of wheat lines and cultivars to S. nodorum. The central idea of these screening methods is based on the fact that the life cycle of S. nodorum requires humid conditions. The release and dispersal of conidia and the germination and growth of germ tubes require a wet leaf surface (Shipton et al. 1971, Holmes and Colhoun 1971, Eyal et al. 1977, Jeger et al. 1981). Frequent rains associated with moderate temperatures, that is 12-25°C, provide favourable conditions for disease build-up in wheat fields (Polley and Clarkson 1978, Eyal 1981). Furthermore, S. nodorum is mainly spread by rain-splashed drops, and inoculum sources within a crop are obviously more important than external sources (Jenkyn and King 1977, Jeger 1983). It follows that under natural conditions the disease is often not evenly spread over the experimental field. Particularly under northern conditions, the weather is not favourable for disease incidence every year, and in order to meet the requirements of efficient resistance breeding it is necessary to use artificial inoculation to test breeding lines and populations.
Breeding lines and cultivars as well as segregating populations can be tested either at the seedling plant stage or at the mature plant stage. The advantage of using mature plant tests is based on the evidence (e.g. Brönnimann 1968) that S. nodorum causes most damage at later growth stages, and several other factors, such as height and maturation time, also affect the expression of resistance. Moreover, the evaluation of resistance at the mature plant stage enables other breeding objectives to be scrutinized at the same time. In the present investigation the validity of the field screening method developed by Scott and Benedikz (1977) was tested, and it was applied to reveal the resistance of Finnish cultivars and foreign sources to S. nodorum.
The results indicate continuous variation in symptom expression (Fig. 26); all cultivars were infected, but to various degrees. This test seems to reveal easily the most resistant cultivars and lines, such as Cl 12463, T. dicoccum, T. timopheevi, Isepton 64, and Isepton 93. It is also apparent on the basis of this test that most Finnish cultivars and breeding lines are fairly susceptible. The most susceptible entries were likewise readily identified. The field method of Scott and Benedikz (1977) appears to be a rapid and reliable way to test the reaction of breeding lines and cultivars to S. nodorum. It was apparent that in order to obtain reliable results one has to be very careful in spraying inoculum onto the plants. To ensure an even spread of inoculum and to promote infection, it is useful to make several successive inoculations. In addition, it is very important to determine the best assessment time, as the clear differentiation between cultivars or lines seems to persist for 10-14 days depending on prevailing weather, after which symptoms are masked by rapidly increasing senescence. Although this test easily reveals the most resistant, moderately resistant, and susceptible cultivars, the detection of minor differences between cultivars requires a great number of replications. This is probably due to the fact that this host-pathogen system is sensitive to genotype-environment interaction (Mullaney et al. 1982), which can weaken the separation of small differences. In particular, the environmental conditions following inoculation greatly affect subsequent disease development. For example, an exceptionally high temperature after inoculation can mask the heritable differences in symptom expression between cultivars (Karjalainen et al. 1983).
Because of these environmentally induced changes in cultivar differences, Fried and Brönnimann (1982) suggest that comparisons between cultivars should be made only if they were inoculated on the same day, since in this way the differences caused by changes in environmental conditions can be avoided. However, simultaneous inoculation can be problematic, because it has recently been shown that tissue susceptibility also depends on the growth stage of the plant (King et al. 1983, Sharma and Brown 1983, J. Jönsson 1983). Some data suggest that tissue susceptibility increases with age so that it is at a maximum at heading/flowering (Brönnimann 1968). In the present study, Figures 27 and 28 clearly indicate that the late cultivars Kadett and Tähti are more resistant than the early ones at early observation times, but in the latter half of the season susceptible cultivars can be severely diseased.
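When severity is assessed repeatedly over the season, as in Figures 27 and 28, the observations can be summarized with the area under the disease progress curve (AUDPC), a standard plant-pathology statistic. This technique was not used in the present study; the sketch below, with hypothetical severities, merely shows how repeated scores would be integrated:

```python
# Sketch (not part of the original study): summarizing repeated severity
# assessments as the area under the disease progress curve (AUDPC),
# computed by the trapezoidal rule. Severities are hypothetical % leaf area.

def audpc(days, severities):
    """Trapezoidal AUDPC over assessment dates (days after inoculation)."""
    total = 0.0
    for i in range(1, len(days)):
        total += (severities[i] + severities[i - 1]) / 2 * (days[i] - days[i - 1])
    return total

days = [7, 14, 21, 28]
early_cv = [10, 30, 55, 80]   # early, susceptible cultivar
late_cv = [2, 10, 30, 70]     # late cultivar: slower early epidemic
print(audpc(days, early_cv), audpc(days, late_cv))
```

A single AUDPC value per plot would capture the whole epidemic rather than the severity at one, possibly ill-chosen, assessment date.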
The present artificial inoculation method, followed by covering the plots with plastic bags, raises some important questions for wheat breeding. This kind of method is rather laborious and time consuming, which implies that only a relatively small number of lines can be tested on limited resources. The plastic bags, in particular, cause a lot of extra work.
In addition, normal plant growth may be disturbed under plastic bags, which may make it difficult to evaluate other breeding objectives at the same time. Recently, these problems have been solved by using repeated irrigation to provide humid conditions and by making several inoculations to ensure favourable disease development (P.R. Scott and P.W. Benedikz 1982, pers. comm.).
This permits efficient evaluation of many characters simultaneously and reduces the time needed for plastic bag work. The data reported in this investigation reveal some important aspects for practical resistance breeding. For example, it is evident that there is a fairly high level of resistance to S. nodorum in wheat germplasm. However, the most resistant entries are the wild Triticum species T. timopheevi and T. dicoccum, thus confirming previous studies (Krupinsky et al. 1972, Tomerlin et al. 1984), as well as some other varieties such as Cl 12463, which was very tall and late maturing under Finnish conditions. Unfortunately, this kind of material is difficult to utilize in practical wheat breeding because the resistance genes from these sources are likely to be associated with many agronomically undesirable characters. However, it is clear from this study that a moderate level of resistance with a suitable agronomic background is available and can easily be utilized in Finnish spring wheat breeding (Fig. 26).

6.4.2. Detection of resistance at the seedling plant stage

Field tests of mature plants are laborious, and therefore in the last few years more attention has been focused on the possibility of testing seedlings in the glasshouse or laboratory in order to evaluate the reaction of cultivars and breeding lines to S. nodorum (Pirson 1960, Kietreiber 1962, Eyal and Scharen 1977, Benedikz et al. 1981).
At present, two methods are used to test wheat seedlings for infection by S. nodorum. Seedlings can be inoculated by spraying them with a spore suspension at the two- to three-leaf stage, or detached seedling leaves can be mounted on benzimidazole agar with a drop of inoculum placed on the centre of the leaf piece. Hitherto, only a few studies have been carried out (Benedikz et al. 1981, Rufty et al. 1981a) to evaluate how well seedling tests predict the field performance of adult plant resistance. In order to obtain more information on this, seedling tests were started in 1982 to provide material for comparisons between field tests and seedling tests. In the first investigation, seedlings were inoculated in the glasshouse at the two-leaf stage and covered with a plastic tent to provide high humidity for disease development. The symptoms were clearly visible three days after inoculation, and the best time to differentiate the cultivars for their reactions to S. nodorum appeared to be 6-8 days after inoculation. The results indicated (Fig. 29) that the most resistant lines/cultivars were T. timopheevi, T. dicoccum, and Cl 12463, while amongst the most susceptible ones were Jo 8259, Jo 8292, Jo 8187, Tähti, Hja 21485, and Hja's Ulla. The seedling test also revealed that such cultivars and lines as Kadett, Drabant, Norröna, Hja's Tapio, Hja 21182, and 80325 have a moderate level of resistance to S. nodorum. An important aspect is that this test easily reveals the most susceptible and the most resistant entries, as indicated by the field test (Fig. 30). The overall correlation between this seedling test and the field test appears to be very high (r = 0.82, p < 0.001), and the seedling test thus seems to predict the field performance of the cultivars quite well. However, some cultivars clearly deviate from this general pattern (Fig. 30). For example, the cultivar Calanda appeared more resistant in the field than at the seedling level.
In the second investigation, detached seedling leaves were mounted on benzimidazole agar according to Benedikz et al. (1981) to evaluate the seedling reaction of wheat cultivars to S. nodorum. Following drop inoculation, the symptoms were clearly visible four days after inoculation, and the cultivars were best differentiated in their disease reactions 8-10 days after inoculation. The data reveal (Fig. 31) that T. timopheevi, T. dicoccum, and Cl 12463 were again the most resistant entries, while Tähti, Hja 21485, Hja 46592, Hja's Taava, and Jo 8219 were among the most susceptible ones. It was also clear that such cultivars as Kadett, Drabant, and Norröna, among others, show a moderate level of resistance to S. nodorum at the seedling stage. The overall correlation between the detached seedling leaf test and the field test is moderately high (r = 0.63, p < 0.001), indicating that the detached seedling leaf test predicts field performance quite well. However, some cultivars again deviate greatly from this general pattern (Fig. 32). For example, Calanda and 80325 were more resistant in the field test than in the detached seedling leaf test, while Jo 8187, Jo 8275, and Hja 21600 appeared more susceptible in the field test.
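The coefficients quoted above (r = 0.82 and r = 0.63) are Pearson correlations between the scores given to the same cultivars in two tests. As a sketch of the kind of comparison involved, here with hypothetical paired scores rather than the actual trial data:

```python
# Sketch: Pearson correlation between seedling-test and field-test scores
# of the same cultivars. The paired scores below are hypothetical.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

seedling = [1.0, 2.5, 3.0, 4.5, 5.0, 6.5, 7.0, 8.0]
field = [1.5, 2.0, 3.5, 4.0, 6.0, 6.0, 7.5, 8.5]
print(round(pearson_r(seedling, field), 2))
```

A high coefficient indicates that cultivars rank similarly in both tests, but, as the deviating cultivars show, individual entries can still change rank considerably.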
The seedling tests evaluated in this study provide important information on the general value of seedling tests for predicting the field reaction of wheat cultivars to S. nodorum. The two seedling tests agree quite well with each other, as revealed by the high correlation coefficient (r = 0.79, p < 0.001). However, there were some deviations which need further clarification. Those deviations may be due to some extent to the benzimidazole agar, since benzimidazole is a fungicide and can somehow affect fungal growth in the leaf, as already noted by Übels (1979). It is also evident that the development of infection is slower in detached leaves mounted on benzimidazole agar than in attached leaves, and attached leaves can in some cases provide a wider range of variation in symptom expression than detached leaves, thus making cultivar comparisons easier. However, in general the visual assessment of seedling plants appears to be more liable to estimation errors (Koch and Hau 1980) than the detached seedling leaf test, because in the latter the inoculum drop can be exactly measured and placed on the centre of the leaf pieces. However, there are increasing data (Eyal and Scharen 1977, Scharen and Eyal 1980) which clearly indicate that the difficulties concerning spraying techniques can be overcome by using quantitative inoculation methods, and there is every reason to assume that an experienced researcher can rapidly and reliably evaluate cultivar differences visually. The method developed by Eyal and Scharen (1977) provides an efficient technique for evaluating the reactions of cultivars and breeding lines to S. nodorum at the seedling stage. Recently, Rufty et al. (1981a) indicated that seedling tests based on the Eyal and Scharen (1977) method can reliably predict the field reaction of cultivars to S. nodorum.
In addition, this quantitative method and its modifications can be used in screening segregation populations, and Scharen and Krupinsky (1978) have been able to improve significantly the level of resistance to S. nodorum using the above seedling test by Eyal and Scharen (1977). On the other hand, it seems evident that the use of detached seedling leaves should be limited to parallel use with field tests in assessing the reactions of cultivars and breeding lines to S. nodorum.
The previous data (Karjalainen 1984 VI) as well as the studies by Benedikz et al. (1981) support the use of seedling tests alongside field tests. Wheat yields have increased considerably during this century (Bingham 1981a, Lupton 1982), and comparisons between old and modern varieties show that about half of this increase is due to the work of plant breeders. A major part of the yield increase is explained by the increase in harvest index (Austin et al. 1980). The above yield comparisons were made under conditions where disease attacks and lodging were prevented. However, it is evident that plant breeding has also significantly improved straw stiffness, baking quality, and disease resistance. Kivi (1963) showed that Finnish plant breeding has made significant contributions to improving cereal yields, and there is no doubt about the better yielding ability of current spring wheat varieties compared with the old ones, although direct comparisons are not available.
Wheat breeding for disease resistance has also been successful in many countries (Lelley 1976, Bingham 1981b). Progress has almost always been faster when the resistance genes are of large effect and dominant (Day 1984b), as, for example, in breeding against rusts and mildew. However, the history of using dominant single genes for disease control also provides examples of failures when new physiological races capable of breaking host resistance have emerged (Kiyosawa 1982). Consequently, we are now aware of the importance of the structure and dynamics of pathogen populations when breeding single-gene resistance for long-lasting disease control (Wolfe et al. 1983, Leonard 1984, Wolfe 1984).
Progress in wheat breeding for resistance to S. nodorum has not been remarkably fast.
The main reason for this is the quantitative nature of the resistance, which makes the detection and incorporation of resistance into new breeding lines difficult (Brönnimann 1982, Fried and Brönnimann 1982). Similar problems are faced in wheat breeding for quantitative resistance to mildew, and Bennett (1984) concludes that the present methods of testing quantitative resistance are labour intensive and inappropriate as selection criteria for breeding programmes. However, practical plant breeding has succeeded in improving the level of Septoria resistance (Scott et al. 1982), and Doodson (1981) has shown the significant economic benefit of cultivating resistant wheats to control Septoria diseases.
The data presented in this study aim at providing relevant information for the intensification of Finnish wheat breeding for resistance to S. nodorum. The strategy of wheat breeding is to detect the rare plants with an improved combination of characters (Lelley 1976, Bingham 1979). As the breeder's task increases in complexity, more plants must be grown for selection in segregating populations (Day 1984b), and it becomes more difficult to find the rare plants with an improved combination of traits such as yield, quality, straw stiffness, earliness, and disease resistance. Consequently, it is necessary to assess the relative economic importance of the diseases likely to be encountered by a new variety. The present study suggests that S. nodorum is capable of causing significant yield reduction even at a moderate level of infection, and under high disease pressure the reduction in yield can be very great. It also seems probable that S. nodorum can reduce wheat quality, because hectolitre weight was reduced, and the reduction in grain weight may have other consequences, for example reduced loaf volume. Mäkelä (1975) indicated that S. nodorum is one of the major pathogens of wheat, and later studies (Avikainen and Hollo 1985, Karjalainen unpublished) suggest that it is frequently found in the wheat-growing areas of Finland. Thus there is every reason to assume that in the long term S. nodorum reduces yields to such an extent that preventing losses can stabilize Finnish wheat production. Host resistance seems to be economically the most attractive method of controlling the damage caused by S. nodorum (Doodson 1981).
Seeking sources of resistance is the first step in incorporating genetic resistance into crop plants. The present study as well as other data (Tomerlin et al. 1984) suggest that an adequate level of resistance is available in the wheat germplasm to enable successful resistance breeding.
However, it appears that the highest level of resistance is found in wild Triticum species and a number of lines and cultivars which are very tall and late maturing. Thus the incorporation of high level resistance into well adapted wheat backgrounds should be one of the primary objectives of wheat breeding programmes for resistance to S. nodorum.
It is not reasonable to use these highly resistant sources in practical wheat breeding until their resistance genes have been incorporated into adapted wheat backgrounds. Meanwhile, breeders can use a number of lines and cultivars, for example those Nordic varieties which have a moderate level of Septoria resistance.
Success in breeding for resistance to S. nodorum depends greatly on efficient methods of detecting resistant plants during the various phases of breeding. Following crossing and F1 self-pollination, genetic segregation generates diverse F2 populations, in which visual selection on a single plant basis is practised for characters of high heritability, such as agronomic type and disease resistance (Bingham 1979, 1981). The present inheritance studies suggest that the heredity component of resistance is high enough to permit efficient selection work, but also that breeding success depends greatly on sensitive methods of finding resistant plants in F2 populations. Efficient screening of segregating populations for resistance to S. nodorum can be done using alternative techniques. The advantage of inoculating plants with a spore suspension at the seedling plant stage is that the influence of growth stage on symptom expression can thus be avoided. However, inoculating plants at the mature plant stage is supported by the idea (Brönnimann 1968, Scott and Benedikz 1977) that host tissue is most susceptible at the heading and flowering phases, and during this time it is easy to take other breeding objectives into account. It seems evident that if inoculations are made before flag leaf emergence, or soon after it, the genetic differences in symptom expression will be clear enough to permit efficient selection. Experience of screening work suggests that in order to detect resistant recombinants in progenies derived from moderately resistant crosses, it is particularly important to make successive inoculations and to provide humid conditions to ensure favourable disease development.
Assessments based on visual estimation of the percentage of diseased leaf area are most convenient in practical breeding, since criteria such as the number of lesions or the amount of necrosis (Scharen and Krupinsky 1978) are too labour intensive, as is the use of differences in latent period or other components of partial resistance (Jeger et al. 1983).
F3 selections are based on observations made on separately grown ear-row plots. Screening for resistance to S. nodorum is done mainly in the field, but a limited number of progenies can be tested in the glasshouse using the seedling inoculation method developed by Eyal and Scharen (1977). From the viewpoint of practical breeding it is important that the desired resistant and agronomically adapted plants can be tested in this generation for protein quality using, for example, the sodium dodecyl sulphate (SDS) precipitation test (Blackman and Gill 1980). This method is very rapid, requires only a small sample, and appears to be a good measure of inherent protein quality which correlates well with baking quality (Payne et al. 1979, 1980). Testing breeding lines and cultivars for resistance to S. nodorum can be done both in the field and in the glasshouse. The present study suggests that the field method developed by Scott and Benedikz (1977) is efficient in testing mature plant resistance. A particularly important finding of the present studies was that seedling techniques based on detached seedling leaves mounted on benzimidazole agar and on attached seedling leaves can predict the field reaction of most cultivars quite well (see also Rufty et al. 1981a), and thus they may considerably facilitate the testing of breeding lines and cultivars, because the tests can be carried out during the winter.
Resistance to S. nodorum appears to be durable, since there are no reports of physiological races capable of breaking host resistance despite the widespread cultivation of resistant varieties (Scott et al. 1982, King et al. 1983). However, S. nodorum is a highly variable pathogen. For example, Scharen and Krupinsky (1970) and Griffiths and Ao (1980) found great variability in sporulation, colony morphology, and aggressiveness among single-conidium isolates taken from a single pycnidium. The genetic basis of the variation in S. nodorum is poorly understood, because the sexual stage occurs rarely in nature and has not been induced in artificial culture (Griffiths and Ao 1980). Variation may be due to heterokaryosis or parasexual recombination (Griffiths and Ao 1980), or it may be associated with the presence of mycoviruses or plasmids, which have recently been found to cause variation in aggressiveness in other fungi (Hollings 1982, Hashiba et al. 1984). Hitherto, the available evidence suggests (Griffiths and Ao 1980, Allingham and Jackson 1981, Rufty et al. 1981b, King et al. 1983) that cultivar resistance to S. nodorum is non-specific: although isolates differ in aggressiveness, they do not differ in the range of cultivars attacked.
Whether modern research tools such as recombinant DNA technology can provide novel avenues for speeding up the development of resistant cultivars is a question of much interest. During the last five years remarkable progress has been made in developing transformation systems for plants (Caplan et al. 1983, Hull 1983, Murai et al. 1983, Paszkowski et al. 1984). Transformation systems based on the Ti plasmids of Agrobacterium tumefaciens are nearly ready for introducing purified DNA sequences into plants (De Block et al. 1984, Matzke et al. 1984). Unfortunately, these systems can be applied to only a limited range of crop plants, and no vectors for cereals other than maize are to be expected in the near future. Thus it appears that for many years to come conventional breeding methods will play the major role in developing resistant wheat varieties. Further, since only single-gene traits can be improved by genetic engineering (Stalker 1984, Day 1984c), its impact on wheat breeding will probably be marginal, because the most important characters are controlled by many genes.
However, novel avenues for controlling diseases like those caused by S. nodorum may open in the more distant future, when the nature of the disease process is well understood. For example, current research (Köller 1983, Soliday et al. 1984) is trying to reveal the nature of the first step in infection, the first molecular interaction between the host and the pathogen, the outcome of which determines the disease reaction. The present ultrastructural observations, which have shown that penetration by S. nodorum takes place directly through intact cell surfaces, may be important for future studies. Understanding the early phases of infection and the nature of defence mechanisms could open novel approaches, such as the development of antipenetrants for disease control (Maiti and Kolattukudy 1979). So far, new approaches to controlling S. nodorum remain theoretical, and present knowledge suggests that the best way of controlling the disease is integrated control using seed treatment, crop rotation and resistant cultivars, as well as fungicide sprays under high disease pressure.

The study examines the fungus Septoria nodorum Berk., which causes yellowish-brown spots on the leaves and stems of wheat and browns the glumes of the ear. In recent years the Septoria disease has increased in different parts of the world and caused appreciable yield losses. In Finland it occurs throughout the wheat-growing area, and in rainy growing seasons its spread has been conspicuous.