The National Statistics System: Our Challenge Presented by Statistics South Africa





6.8 Canada

Canada has one of the top four agencies renowned globally for best practice in official statistics, with an extensive international development programme. Under the Statistics Act, Statistics Canada is required to "collect, compile, analyse, abstract and publish statistical information relating to the commercial, industrial, financial, social, economic and general activities and conditions of the people of Canada." Statistics Canada prides itself on two main objectives: firstly, to provide statistical information and analysis about Canada's economic and social structure for developing and evaluating public policies and programmes, thereby improving public and private decision-making for the benefit of all Canadians.


Secondly, to promote sound statistical standards and practices by using common concepts and classifications to provide better-quality data. Through working with the provinces and territories, Statistics Canada aims to achieve greater efficiency in data collection and less duplication. Furthermore, it aims to reduce the burden on respondents through greater use of data-sharing agreements (sources used include annual tax records, monthly employee payroll records and customs records) and to improve statistical methods and systems through joint research studies and projects. In addition to running about 350 active surveys on average per year, Statistics Canada brings statistics to life through a variety of innovative programmes in schools, among many other methods.

Furthermore, Statistics Canada operates within the context of portals or clusters rather than through government departments. The challenge for the agency is how to structure the design of these portals so that they interface seamlessly with the data holdings of Statistics Canada.


On the issue of methods and standards, the Chief Statistician of Canada had this to say:

“Recently the news media have provided increasing coverage of Statistics Canada's low income cut-offs and their relationship to the measurement of poverty. At the heart of the debate is the use of the low income cut-offs as poverty lines, even though Statistics Canada has clearly stated, since their publication began over 25 years ago, that they are not. The high profile recently given to this issue has presented Statistics Canada with a welcome opportunity to restate its position on these issues. Many individuals and organizations both in Canada and abroad understandably want to know how many people and families live in "poverty", and how these levels change. Reflecting this need, different groups have at different times developed various measures which purported to divide the population into those who were poor and those who were not. In spite of these efforts, there is still no internationally-accepted definition of poverty - unlike measures such as employment, unemployment, gross domestic product, consumer prices, international trade and so on. This is not surprising, perhaps, given the absence of an international consensus on what poverty is and how it should be measured. Such consensus preceded the development of all other international standards. The lack of an internationally-accepted definition has also reflected indecision as to whether an international standard definition should allow comparisons of well-being across countries compared to some international norm, or whether poverty lines should be established according to the norms within each country.


The proposed poverty lines have included, among others, relative measures (you are poor if your means are small compared to others in your population) and absolute measures (you are poor if you lack the means to buy a specified basket of goods and services designated as essential).
Both approaches involve judgmental and, hence, ultimately arbitrary choices. In the case of the relative approach, the fundamental decision is what fraction of the overall average or median income constitutes poverty. Is it one-half, one-third, or some other proportion? In the case of the absolute approach, the number of individual judgements required to arrive at a poverty line is far larger. Before anyone can calculate the minimum income needed to purchase the "necessities" of life, they must decide what constitutes a "necessity" in food, clothing, shelter and a multitude of other purchases, from transportation to reading material.
The underlying difficulty is due to the fact that poverty is intrinsically a question of social consensus, at a given point in time and in the context of a given country. Someone acceptably well off in terms of the standards in a developing country might well be considered desperately poor in Canada. And even within the same country, the outlook changes over time. A standard of living considered as acceptable in the previous century might well be viewed with abhorrence today.
It is through the political process that democratic societies achieve social consensus in domains that are intrinsically judgmental. The exercise of such value judgements is certainly not the proper role of Canada's national statistical agency which prides itself on its objectivity, and whose credibility depends on the exercise of that objectivity.

In Canada, the Federal/Provincial/Territorial Working Group on Social Development Research and Information was established to create a method of defining and measuring poverty. This group, created by Human Resources Development Canada and social services ministers in the various jurisdictions, has proposed a preliminary market basket measure of poverty - a basket of market-priced goods and services. The poverty line would be based on the income needed to purchase the items in the basket. Once governments establish a definition, Statistics Canada will endeavour to estimate the number of people who are poor according to that definition. Certainly that is a task in line with its mandate and its objective approach. In the meantime, Statistics Canada does not and cannot measure the level of "poverty" in Canada.


For many years, Statistics Canada has published a set of measures called the low income cut-offs. We regularly and consistently emphasize that these are quite different from measures of poverty. They reflect a well-defined methodology which identifies those who are substantially worse off than the average. Of course, being significantly worse off than the average does not necessarily mean that one is poor.
Nevertheless, in the absence of an accepted definition of poverty, these statistics have been used by many analysts to study the characteristics of the relatively worst off families in Canada. These measures have enabled us to report important trends, such as the changing composition of this group over time. For example, 20 to 30 years ago the elderly were by far the largest group within the "low income" category, while more recently lone-parent families headed by women have grown in significance.
Many people both inside and outside government have found these and other insights to be useful. As a result, when Statistics Canada carried out a wide-ranging public consultation a decade ago, we were almost unanimously urged to continue to publish our low income analyses. Furthermore, in the absence of a generally accepted alternative methodology, the majority of those consulted urged us to continue to use our present definitions. In the absence of politically-sanctioned social consensus on who should be regarded as "poor", some people and groups have been using the Statistics Canada low-income lines as a de facto definition of poverty. As long as that represents their own considered opinion of how poverty should be defined in Canada, we have no quarrel with them: all of us are free to have our own views. But they certainly do not represent Statistics Canada's views about how poverty should be defined.”10 Italics and bold are the author’s.
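The distinction drawn above between relative and absolute measures can be made concrete with a short sketch. The income figures, the half-of-median fraction and the basket prices below are purely illustrative assumptions, not Statistics Canada data; the sketch only shows how each judgmental choice translates into a cut-off.

from statistics import median

# Hypothetical annual household incomes (illustrative only).
incomes = [12_000, 18_500, 24_000, 31_000, 45_000, 62_000, 80_000]

# Relative approach: poor if income falls below some fraction of the median.
# The fraction itself is the judgmental choice the quotation refers to.
RELATIVE_FRACTION = 0.5
relative_line = RELATIVE_FRACTION * median(incomes)

# Absolute approach: poor if income cannot buy a designated basket of
# "necessities". Every item and price here is a further judgmental choice.
basket = {"food": 4_800, "shelter": 9_600, "clothing": 1_200, "transport": 1_800}
absolute_line = sum(basket.values())

below_relative = sum(1 for x in incomes if x < relative_line)
below_absolute = sum(1 for x in incomes if x < absolute_line)

print(f"Relative line: {relative_line:,.0f} -> {below_relative} of {len(incomes)} below")
print(f"Absolute line: {absolute_line:,.0f} -> {below_absolute} of {len(incomes)} below")

Changing either the fraction or the basket contents moves the line, and with it the count of people classified as "poor", which is precisely why the quotation describes such choices as ultimately arbitrary.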
6.9 Korea
The National Statistics Office of Korea was established by Act in 1948, when the Korean government was first established. The office, then called the Bureau, consisted of one officer and four divisions: General Affairs, Planning, Population Census and Vital Statistics. On 13 December 1948, the first Population Census was declared by the 39th presidential decree, the first administrative order the Korean government ever made in relation to statistical policy. After the Korean War, the Constitution was reviewed in line with the introduction of the market economic system. This implied far-reaching changes: the Bureau of Statistics was transferred to the Ministry of Home Affairs in 1955 and, after further re-organisation, was reconstituted into three divisions of Planning, Population Census and Vital Statistics; General Affairs had disappeared. Six years later, in 1961, the Economic Planning Board (EPB) was launched, and the Bureau of Statistics was relocated from the Ministry of Home Affairs to the EPB. An additional branch, the Division of Survey and Analysis, was added in 1962. By the end of 1963, the Bureau of Statistics had been renamed the National Bureau of Statistics (NBOS). At the same time the NBOS, now an external bureau of the EPB, was restructured into the four divisions of Statistical Planning, Statistical Standards, Population Statistics, and Economic Statistics. As a consequence of this restructuring and its location in the EPB, the NBOS expanded several times. During this period, the NBOS played a vital role in providing the basic data required for formulating and evaluating economic development policies.
Recognising the increasing importance of producing fundamental statistics and of coordinating national statistical services, the National Bureau of Statistics was again increased in size and administratively promoted to the status of a vice-ministry, as is the case in Mozambique, and renamed the National Statistical Office (NSO) in December 1990. The NSO then comprised 3 bureaus, 14 divisions, 11 statistical offices and 15 local branch offices.
To address training needs, the Statistical Training Center was set up in September 1991 to produce professional statistical experts. In February 1995, two divisions were established, Statistical Information and International Statistics, aimed at improving the quality of statistical information services and fulfilling ever-increasing demands for international statistical services.
In September 1996, the position of the Statistical Examiner was created, and the Division of Population Statistics was expanded into 2 Divisions of Population Census and Vital Statistics. In addition, the Division of Statistical Research was also created.
Currently the NSO has 4 bureaux, 19 divisions, 12 statistical offices, 35 local branch offices and 1 training centre.
Work on social indicators in Korea was initiated thirty years ago, in 1972. As a result of this process, by 1978 Korea had three hundred and fifty indicators across eight areas of social concern. By 1987, in response to changing economic conditions, Korea had extended its indicator set to 468, covering nine areas of social concern. As a result of continuous improvement, Korea now maintains 553 indicators covering 13 areas of social concern. Korea is at the forefront of measuring information and communications technology (ICT) as part of the indicator package.
6.10 South Africa
“The world-wide movement to transform the way government goes about its business has greatly influenced the Government of South Africa. The government has sought to place at the forefront of governance accountability to the citizenry of the country, effective and efficient management; delivery of services and “business” principles of business planning and goal attainment as the way it runs South Africa.”11
6.10.1 Background
To understand the trajectory of the development of statistical systems in South Africa, some background is appropriate. The development of the statistical system in South Africa has often been rooted in the underlying political system. Mclennan confirms this observation by suggesting that "the evolution of official statistics in each country is mainly a product of its history."12 Although South Africa has one of the longest formal histories of statistical development on the African continent, its path has largely been a sorry one. In general, the system has been undernourished, characterised by extreme and rapidly growing fragmentation. This was particularly so in the period from 1970 to 1994, consistent with the implementation of grand apartheid. Prior to this period, the system reflected the racial tensions inherent in the formative years of apartheid. The post-liberation period has been marked by attempts to consolidate the system, but a lack of vision and strategy undermined attempts at operational effectiveness. This meeting, co-hosted by The Presidency, Statistics South Africa and PARIS21, allows for strategic engagement on statistical development. It is the first deliberate and strategic attempt to create an NSS to inform a democratic dispensation.
6.10.2 Fragmentation of the system under apartheid
From 1970 to 1994 the system was fragmented and dominated by six distinct forces, each with its own sphere of geographic and/or thematic influence.


  • The CSS focused predominantly on Whites as a population group and on the economy of the geographic component referred to as the erstwhile White South Africa. This left the economy of the Black population in the then South Africa relatively unknown, and it remains unexplored in South Africa as currently constituted.

  • The HSRC, as a source of official statistical information, focused on the Black population, both in the homelands and in South Africa, but largely studied their demographics as a theme.

  • The third force was the DBSA, which looked at the financing of Black development and, as an information source, acted predominantly in the homelands. There was some conflict and professional jealousy between the erstwhile CSS and the DBSA, particularly over the report on the nine provinces issued in 1994/95.

  • The fourth area was dominated by academia, which was largely active in demographics and population projections.

  • The fifth force consisted of market researchers, among whom the Bureau of Market Research was the predominant player, specialising in particular in income and expenditure surveys. They were also very active in the homelands.

  • The sixth force consisted of the statistics offices of the homelands themselves, with varying levels of effectiveness. Their focus was largely on informing the sharing of the Customs Revenue Pool and on running censuses.

The official body, the then CSS, focused on a minority of fewer than five million people, as opposed to the forty-three million to whom Stats SA now has to deliver. Given this history of fragmentation, there is a need for a level-headed and deliberate process aimed at consolidating the system at the product, service, solution and institutional levels.


6.10.3 The RDP
In an effort to deal with the challenges outlined above, in 1994 Cabinet agreed that every program and project in the government should have a business plan. Such a business plan would stipulate what the government or department seeks to attain, as well as the means of doing so. Included in the package were cost structures and costs, as well as the identification of beneficiaries. More importantly, the proposals had to define in concrete terms how the project or program was connected to the government's overall goals set out in the Reconstruction and Development Programme (RDP) white paper, the Constitution, and other legislative arrangements and priorities.
Quite clearly, complying with the Cabinet decision implied having a battery of information at different layers of government and geography and at different levels of program, project and/or activity. The test for implementation was how government departments would respond to the first hundred days, which among other things identified the following goals: access to medical care for all pregnant women, universal education for children of school-going age, a feeding scheme at primary schools, electrification of areas that did not have electricity, and provision of water and sanitation. Furthermore, the building of houses, with the aim of reaching a million within five years, was just beginning to be witnessed.
The programme was both ambitious and necessary. It was launched by the State President with the aim of having a high-profile impact, while noting that addressing the inequities of the past would take a while. The programme marked the ushering in of a new government that also focused on more profound long-range plans that would come on line in subsequent years. Officials were quite eager to plan and implement but were confronted with several challenges.
6.10.4 Information and Data challenges
The first challenge for South African government officials was the identification of "scientifically" valid measures and the development of some socio-political and business consensus on their utility as acceptable measures of the phenomena in question. The second problem was a lack of information and data that could enable planning at sub-national levels, i.e. at the local level. Local economic plans lacked basic data, and this made planning very difficult to undertake beyond mere statements of intent. Facing a bureaucracy that was not only unsympathetic but also had no vision of the information needs of society made the work of the new political and bureaucratic incumbents unenviable. Fellegi, commenting on systems of measurement and indicators, notes that "It is through the political process that democratic societies achieve social consensus in indicator domains that are intrinsically judgmental."13 Italics are the author's.
6.10.5 Limitations of attempts for meeting data and indicator needs
“Central to assessing the business plan and evaluating the extent to which the government in general, and departments and sub-units of departments in particular, attain their goals (simply, whether they do what they are supposed to do), a system of accountability was proposed and agreed upon, which entails: the identification of measurable goals; the identification of measurements or indicators of success or failure in attainment of the goals; and some way of measuring the extent, prevalence, dimensions, growth and consequences of a particular problem such as poverty or ill health, unemployment, crime rate, etc.”14

6.10.5.1 Training for technology instead
Early attempts to address not only indicators and their systems but, more importantly, rudimentary data needs focused on information technology instead of dealing with the substantive content issues of a system of indicators. These issues would include, among others, the choice of indicators, what they are, how they should be compiled, what quality and methodological principles should be observed, the periodicity of their generation, institutional arrangements for their compilation, the infrastructure required, funding considerations and the building of consensus on them. It therefore came as no surprise that this gallant effort by the officials in the RDP office never generated the indicator framework it was supposed to. A training programme whose initial cost estimates ran to about R9 million was undertaken by ESKOM for government officials, and CASE tools were applied by King information consultants, who taught participants one-to-many and many-to-one relationships, how to develop relational databases, the importance of Joint Application Development (JAD), and object orientation. This training, while important in its own right for a different audience, indeed missed the point.
6.10.5.2 Good idea, bad timing
With the benefit of hindsight, one can argue that not only was this approach unwise, but its timing was also wrong. The approach was unwise in that it did not start with the data needs of government for development. Instead it focused on the electronic management of things that did not conceptually exist or had not been adequately discussed and understood. The implementation of the training was also fraught with timing difficulties. Staff who attended what largely appeared to be technology training were trained with no guarantee that they were the most likely to end up in the planning and monitoring offices of the new government. At the time the training was implemented, restructuring had just begun in earnest. Uncertainty was running rampant, both amongst the old guard, who were seeing their way out and negotiating packages and golden handshakes, and amongst the new bureaucrats, who came in sure of a place but not certain of the area they would occupy. The statistics organisation, supposedly the main cradle of development information, was in bad shape and intransigent to change, and did not even participate in the training.
6.10.5.3 Nomenclature: Talking past one another
The third level of problems was nomenclature and the use of terminology, which on the face of it may appear to be trivia; when challenged by major development issues of brick and mortar, medicines and food, terminology and language become the last thing to argue about. Mokaba notes that “most of the confusion in government language between what is a project, program or simply a task emanates from the multiplicity of levels at which each of the speakers may be operating. This was clearly indicated by the confusion around the Katorus Presidential Lead Project or program. To the province and the local unit the Katorus operation was a R600 million program consisting of various elements of capacity building, infrastructure development, local government, and safety and security. But to the office of the president and to the minister without portfolio, and surely for the president, Katorus was just a project, one of the projects that comprise the total urban redevelopment program.”15

“Perhaps even more confusing about the empirical referent of the word program in the GNU is the use of the term in the budgetary process to refer to the health services of South Africa as a sectoral program and to the Reconstruction and Development Program or the National Works Program.”16


6.10.5.4 Subtle turf battles and procrastination
The fourth level of problems consisted of subtle battles over turf and posturing by the information producers, which precipitated fragmentation. While the new government noted that there was a serious paucity of information, there was no visible effort to prioritise statistics and improve the situation; instead, private-sector initiatives and individual departmental efforts helplessly attempted to address these data and information requirements. The Central Statistical Service, like the Human Sciences Research Council, for historical reasons found itself a passive observer in the scramble to provide information. Amongst the more active in the information arena was the CSIR, which supplied the data for what was to be popularly coined the spatial development initiatives (SDIs), identifying corridors with development potential. Another major source of information was the SALDRU study, which provided socio-economic data, but much more at a macro level, while data requirements for planning, in the instance of the RDP, are often needed at a more local level. The DBSA at the time had produced an information document on the nine provinces and was playing an active role as an information supplier. The Central Statistical Service came rather late to the party and was a less welcome member because it was indeed intransigent and/or lacked a profound understanding and appreciation of the central role it had to play in a changing society. In direct competition with the DBSA, it produced provincial profiles. The stage was adequately muddied by bits of inadequate information and posturing in the information arena.